Feb 17 14:52:12 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 17 14:52:12 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 14:52:12 crc restorecon[4690]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc 
restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc 
restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 
14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc 
restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 14:52:12 crc 
restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 
crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:12 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc 
restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:52:13 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 
crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc 
restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc 
restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:52:13 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:52:13 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 17 14:52:15 crc kubenswrapper[4717]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 14:52:15 crc kubenswrapper[4717]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 17 14:52:15 crc kubenswrapper[4717]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 14:52:15 crc kubenswrapper[4717]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 17 14:52:15 crc kubenswrapper[4717]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 17 14:52:15 crc kubenswrapper[4717]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.262381 4717 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267720 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267752 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267762 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267770 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267779 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267790 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267800 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267808 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267816 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267825 4717 
feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267836 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267847 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267856 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267865 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267874 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267883 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267892 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267901 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267909 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267917 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267924 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267934 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267943 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 
14:52:15.267953 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267963 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267971 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267980 4717 feature_gate.go:330] unrecognized feature gate: Example Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267988 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.267995 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268003 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268011 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268019 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268027 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268037 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268047 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268057 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268067 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268116 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268127 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268136 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268145 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268154 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268162 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268170 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268178 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268186 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268194 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268203 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268210 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268218 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268226 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268234 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268242 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268249 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268260 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268269 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268277 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268288 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268298 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268306 4717 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268313 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268321 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268333 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268343 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268352 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268359 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268367 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268375 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268383 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268391 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.268398 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268535 4717 flags.go:64] FLAG: --address="0.0.0.0"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268550 4717 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268564 4717 flags.go:64] FLAG: --anonymous-auth="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268579 4717 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268590 4717 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268599 4717 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268610 4717 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268621 4717 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268631 4717 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268640 4717 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268649 4717 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268659 4717 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268669 4717 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268679 4717 flags.go:64] FLAG: --cgroup-root=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268687 4717 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268696 4717 flags.go:64] FLAG: --client-ca-file=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268704 4717 flags.go:64] FLAG: --cloud-config=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268713 4717 flags.go:64] FLAG: --cloud-provider=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268722 4717 flags.go:64] FLAG: --cluster-dns="[]"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268734 4717 flags.go:64] FLAG: --cluster-domain=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268743 4717 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268752 4717 flags.go:64] FLAG: --config-dir=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268760 4717 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268770 4717 flags.go:64] FLAG: --container-log-max-files="5"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268782 4717 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268791 4717 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268801 4717 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268811 4717 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268820 4717 flags.go:64] FLAG: --contention-profiling="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268829 4717 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268838 4717 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268847 4717 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268856 4717 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268866 4717 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268876 4717 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268885 4717 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268893 4717 flags.go:64] FLAG: --enable-load-reader="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268902 4717 flags.go:64] FLAG: --enable-server="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268911 4717 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268924 4717 flags.go:64] FLAG: --event-burst="100"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268933 4717 flags.go:64] FLAG: --event-qps="50"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268943 4717 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268952 4717 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268960 4717 flags.go:64] FLAG: --eviction-hard=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268971 4717 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268980 4717 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268989 4717 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.268998 4717 flags.go:64] FLAG: --eviction-soft=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269007 4717 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269016 4717 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269027 4717 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269037 4717 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269046 4717 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269056 4717 flags.go:64] FLAG: --fail-swap-on="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269065 4717 flags.go:64] FLAG: --feature-gates=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269106 4717 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269115 4717 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269124 4717 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269134 4717 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269143 4717 flags.go:64] FLAG: --healthz-port="10248"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269153 4717 flags.go:64] FLAG: --help="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269162 4717 flags.go:64] FLAG: --hostname-override=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269170 4717 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269179 4717 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269188 4717 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269197 4717 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269206 4717 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269214 4717 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269223 4717 flags.go:64] FLAG: --image-service-endpoint=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269232 4717 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269241 4717 flags.go:64] FLAG: --kube-api-burst="100"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269250 4717 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269261 4717 flags.go:64] FLAG: --kube-api-qps="50"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269272 4717 flags.go:64] FLAG: --kube-reserved=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269283 4717 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269296 4717 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269307 4717 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269317 4717 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269328 4717 flags.go:64] FLAG: --lock-file=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269339 4717 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269351 4717 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269365 4717 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269383 4717 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269394 4717 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269406 4717 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269418 4717 flags.go:64] FLAG: --logging-format="text"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269428 4717 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269441 4717 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269451 4717 flags.go:64] FLAG: --manifest-url=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269462 4717 flags.go:64] FLAG: --manifest-url-header=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269476 4717 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269488 4717 flags.go:64] FLAG: --max-open-files="1000000"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269501 4717 flags.go:64] FLAG: --max-pods="110"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269513 4717 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269522 4717 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269531 4717 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269540 4717 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269549 4717 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269558 4717 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269567 4717 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269591 4717 flags.go:64] FLAG: --node-status-max-images="50"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269601 4717 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269612 4717 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269622 4717 flags.go:64] FLAG: --pod-cidr=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269634 4717 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269651 4717 flags.go:64] FLAG: --pod-manifest-path=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269662 4717 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269748 4717 flags.go:64] FLAG: --pods-per-core="0"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269761 4717 flags.go:64] FLAG: --port="10250"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269773 4717 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269784 4717 flags.go:64] FLAG: --provider-id=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269797 4717 flags.go:64] FLAG: --qos-reserved=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269809 4717 flags.go:64] FLAG: --read-only-port="10255"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269821 4717 flags.go:64] FLAG: --register-node="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269831 4717 flags.go:64] FLAG: --register-schedulable="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269845 4717 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269864 4717 flags.go:64] FLAG: --registry-burst="10"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269875 4717 flags.go:64] FLAG: --registry-qps="5"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269884 4717 flags.go:64] FLAG: --reserved-cpus=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269893 4717 flags.go:64] FLAG: --reserved-memory=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269904 4717 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269913 4717 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269922 4717 flags.go:64] FLAG: --rotate-certificates="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269932 4717 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269940 4717 flags.go:64] FLAG: --runonce="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269950 4717 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269959 4717 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269968 4717 flags.go:64] FLAG: --seccomp-default="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269977 4717 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269986 4717 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.269995 4717 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270005 4717 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270014 4717 flags.go:64] FLAG: --storage-driver-password="root"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270024 4717 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270033 4717 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270042 4717 flags.go:64] FLAG: --storage-driver-user="root"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270052 4717 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270062 4717 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270072 4717 flags.go:64] FLAG: --system-cgroups=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270112 4717 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270127 4717 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270136 4717 flags.go:64] FLAG: --tls-cert-file=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270144 4717 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270158 4717 flags.go:64] FLAG: --tls-min-version=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270169 4717 flags.go:64] FLAG: --tls-private-key-file=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270180 4717 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270190 4717 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270203 4717 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270215 4717 flags.go:64] FLAG: --v="2"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270230 4717 flags.go:64] FLAG: --version="false"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270243 4717 flags.go:64] FLAG: --vmodule=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270253 4717 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.270263 4717 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270493 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270503 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270512 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270521 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270530 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270541 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270551 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270559 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270568 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270578 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270587 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270596 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270605 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270613 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270621 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270630 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270639 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270646 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270654 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270662 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270670 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270678 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270685 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270693 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270700 4717 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270708 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270717 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270724 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270732 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270739 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270751 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270761 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270769 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270777 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270785 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270793 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270801 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270809 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270816 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270824 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270832 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270840 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270849 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270857 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270864 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270872 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270880 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270888 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270895 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270905 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270915 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270924 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270934 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270944 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270952 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270961 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270969 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270978 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270987 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.270996 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.271004 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.271011 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.271019 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.271027 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.271035 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.271042 4717 feature_gate.go:330] unrecognized feature gate: Example
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.271051 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.271059 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.271067 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.271074 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.271108 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.271131 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.303396 4717 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.303492 4717 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303676 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303697 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303709 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303718 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303728 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303738 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303746 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303756 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303765 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303774 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303784 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303793 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303804 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303813 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303821 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303831 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303840 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303849 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303858 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303867 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303877 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303887 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303896 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303905 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303914 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303925 4717 feature_gate.go:330] unrecognized feature gate:
NetworkLiveMigration Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303934 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303943 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303955 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303964 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303975 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303984 4717 feature_gate.go:330] unrecognized feature gate: Example Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.303994 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304004 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304042 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304053 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304063 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304072 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304131 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304142 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304151 4717 
feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304161 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304175 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304197 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304208 4717 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304218 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304231 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304242 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304254 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304265 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304274 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304287 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304299 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304308 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304318 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304328 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304340 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304353 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304364 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304375 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304386 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304395 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304404 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304413 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304424 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304434 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 
17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304443 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304452 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304460 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304469 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304493 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.304510 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304843 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304859 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304922 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304935 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304946 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.304956 4717 feature_gate.go:330] 
unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305035 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305045 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305055 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305064 4717 feature_gate.go:330] unrecognized feature gate: Example Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305073 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305110 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305120 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305129 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305139 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305149 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305158 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305170 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305183 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305193 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305203 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305213 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305223 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305234 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305243 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305256 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305268 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305278 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305288 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305297 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305306 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305315 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305324 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305333 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305359 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305369 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305378 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305388 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305397 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305406 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 
14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305416 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305427 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305436 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305449 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305460 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305472 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305484 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305495 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305504 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305514 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305523 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305532 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305541 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305550 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 
17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305559 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305570 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305580 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305590 4717 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305603 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305613 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305624 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305634 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305643 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305653 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305662 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305675 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305684 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305693 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305738 4717 
feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305749 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.305774 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.305789 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.306246 4717 server.go:940] "Client rotation is on, will bootstrap in background" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.325969 4717 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.326196 4717 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.339734 4717 server.go:997] "Starting client certificate rotation" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.339793 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.366920 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-10 05:56:19.29791897 +0000 UTC Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.367073 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.485401 4717 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.490149 4717 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.501656 4717 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.550388 4717 log.go:25] "Validated CRI v1 runtime API" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.721207 4717 log.go:25] "Validated CRI v1 image API" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.723836 4717 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.729764 4717 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-17-14-47-59-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.729818 4717 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.756524 4717 manager.go:217] Machine: {Timestamp:2026-02-17 14:52:15.753567145 +0000 UTC m=+2.169407661 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7c6444a0-8f9e-4f16-931b-2a332675c205 BootID:8454a012-5158-4995-9509-e8fe74ab1270 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 
Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:8f:4a:97 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:8f:4a:97 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:5f:4e:76 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:af:5f:b8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:7d:f1:55 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ea:18:a6 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:de:bd:48:bb:51:b6 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fa:d3:aa:cd:b6:ad Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.756890 4717 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.757173 4717 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.757547 4717 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.757794 4717 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.757838 4717 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":nul
l,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.758175 4717 topology_manager.go:138] "Creating topology manager with none policy" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.758189 4717 container_manager_linux.go:303] "Creating device plugin manager" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.758644 4717 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.758687 4717 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.759597 4717 state_mem.go:36] "Initialized new in-memory state store" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.759728 4717 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.763162 4717 kubelet.go:418] "Attempting to sync node with API server" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.763194 4717 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.763212 4717 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.763227 4717 kubelet.go:324] "Adding apiserver pod source" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.763252 4717 apiserver.go:42] "Waiting for node sync before watching apiserver pods" 
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.771316 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.771387 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused
Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.771498 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.771536 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.773360 4717 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.774732 4717 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.777175 4717 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778684 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778711 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778719 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778726 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778741 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778750 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778758 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778769 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778779 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778788 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778806 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.778814 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.779463 4717 plugins.go:603] "Loaded volume plugin"
pluginName="kubernetes.io/csi"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.780007 4717 server.go:1280] "Started kubelet"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.780273 4717 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.780419 4717 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.781207 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.781354 4717 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 17 14:52:15 crc systemd[1]: Started Kubernetes Kubelet.
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.782996 4717 server.go:460] "Adding debug handlers to kubelet server"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.785934 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.785983 4717 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.786283 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 10:59:26.717774359 +0000 UTC
Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.786414 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.786489 4717 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217
14:52:15.786500 4717 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.786562 4717 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.786804 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="200ms"
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.787190 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused
Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.787750 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.787829 4717 factory.go:55] Registering systemd factory
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.787855 4717 factory.go:221] Registration of the systemd container factory successfully
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.788553 4717 factory.go:153] Registering CRI-O factory
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.788574 4717 factory.go:221] Registration of the crio container factory successfully
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.788641 4717 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api
service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.788666 4717 factory.go:103] Registering Raw factory
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.788685 4717 manager.go:1196] Started watching for new ooms in manager
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.789337 4717 manager.go:319] Starting recovery of all containers
Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.786307 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.74:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1895104960a0e903 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:52:15.779973379 +0000 UTC m=+2.195813855,LastTimestamp:2026-02-17 14:52:15.779973379 +0000 UTC m=+2.195813855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.805848 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.805934 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.805947
4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.805960 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.805989 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806006 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806024 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806038 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806055 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the
actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806066 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806142 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806160 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806175 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806194 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806208 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806232 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806248 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806311 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806324 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806338 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806357 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486"
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806372 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806386 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806398 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806415 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806428 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806443 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert"
seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806456 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806469 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806485 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806500 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806514 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806532 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217
14:52:15.806545 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806556 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806568 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806580 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806591 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806604 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806615 4717 reconstruct.go:130] "Volume is marked as uncertain and added
into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806627 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806640 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806654 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806666 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806681 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806699 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806778 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806790 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806801 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806812 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806827 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806842 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69"
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806861 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806873 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806887 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806899 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806911 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806923 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136"
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806933 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806944 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806953 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806966 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806978 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.806990 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz"
seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.807002 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.807013 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.807027 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.807039 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.807051 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.807063 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217
14:52:15.807094 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.807106 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810237 4717 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810287 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810313 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810335 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 17 14:52:15
crc kubenswrapper[4717]: I0217 14:52:15.810356 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810380 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810402 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810426 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810448 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810467 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810489 4717 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810509 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810535 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810555 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810577 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810597 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810617 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810639 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810662 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810684 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810706 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810731 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810752 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810771 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810791 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810809 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810830 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810851 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810872 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810894 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810917 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810935 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810950 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810973 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.810991 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811006 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811024 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811042 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811059 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811104 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811129 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811147 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811164 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811180 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811195 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811212 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811230 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" 
seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811246 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811266 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811282 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811297 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811313 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811328 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811347 4717 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811365 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811383 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811399 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811415 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811439 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811453 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811469 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811492 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811507 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811524 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811539 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811555 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811569 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811583 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811597 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811615 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811631 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811646 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" 
seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811661 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811674 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811688 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811703 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811718 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811734 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 
14:52:15.811750 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811764 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811782 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811800 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811817 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811837 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811873 4717 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811899 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811926 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811949 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811972 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.811995 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812018 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812038 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812059 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812101 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812126 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812145 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812164 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812184 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812213 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812233 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812254 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812279 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812296 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812311 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812325 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812340 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812355 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812368 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812382 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812398 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812414 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812430 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812443 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812457 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812473 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812488 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812502 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812518 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812531 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812545 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812559 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812574 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812594 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812610 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812630 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812650 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812668 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812687 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812705 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812721 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812738 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812754 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812773 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812795 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812812 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812829 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812845 4717 reconstruct.go:97] "Volume reconstruction finished"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812857 4717 reconciler.go:26] "Reconciler: start to sync state"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.812868 4717 manager.go:324] Recovery completed
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.822004 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.823673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.823713 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.823746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.824841 4717 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.824855 4717 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.824876 4717 state_mem.go:36] "Initialized new in-memory state store"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.842841 4717 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.843381 4717 policy_none.go:49] "None policy: Start"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.845028 4717 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.845054 4717 state_mem.go:35] "Initializing new in-memory state store"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.845393 4717 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.845447 4717 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.845491 4717 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 17 14:52:15 crc kubenswrapper[4717]: W0217 14:52:15.846305 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused
Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.846377 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.846456 4717 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.886501 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.906309 4717 manager.go:334] "Starting Device Plugin manager"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.906378 4717 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.906399 4717 server.go:79] "Starting device plugin registration server"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.907172 4717 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.907251 4717 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.907573 4717 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.907768 4717 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.907778 4717 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.919109 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.947399 4717 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.947546 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.949217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.949268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.949277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.949499 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.950458 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.950578 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.951963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.952016 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.952035 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.952265 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.952779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.952951 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.952979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.953526 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.953766 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.955661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.955696 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.955729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.955843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.955874 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.955883 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.956025 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.956186 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.956230 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.957563 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.957601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.957617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.957828 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.958061 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.958163 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.958561 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.958608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.958620 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.959048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.959113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.959122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.959658 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.959771 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.960768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.960806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.960153 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.965063 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.965178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:15 crc kubenswrapper[4717]: I0217 14:52:15.965201 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:15 crc kubenswrapper[4717]: E0217 14:52:15.987827 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="400ms"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.007741 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.009069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.009163 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.009184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.009233 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: E0217 14:52:16.010042 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.74:6443: connect: connection refused" node="crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016261 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016319 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016367 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016402 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016439 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016530 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016594 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016659 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016706 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016753 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016798 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016827 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016878 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016925 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.016968 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.117702 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.117777 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.117813 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.117846 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.117904 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.117938 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.117973 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118003 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118010 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118169 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118197 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118226 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118270 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118279 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118039 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118228 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118415 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118305 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118463 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118499 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") "
pod="openshift-etcd/etcd-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118549 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118672 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118766 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118809 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118872 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118895 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118894 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118957 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.118996 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.119135 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.211216 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.213156 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.213237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 
14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.213252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.213289 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:52:16 crc kubenswrapper[4717]: E0217 14:52:16.214139 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.74:6443: connect: connection refused" node="crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.280584 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.287580 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.304189 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.325368 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.329737 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:52:16 crc kubenswrapper[4717]: W0217 14:52:16.344551 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d86325926430910f9437b14fb839de697fd363cad479d894e449493c0df01014 WatchSource:0}: Error finding container d86325926430910f9437b14fb839de697fd363cad479d894e449493c0df01014: Status 404 returned error can't find the container with id d86325926430910f9437b14fb839de697fd363cad479d894e449493c0df01014 Feb 17 14:52:16 crc kubenswrapper[4717]: W0217 14:52:16.346353 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-d2256ad57d1d9378fbd857e046de50c9d4d6fa7142de273c2973481686933b40 WatchSource:0}: Error finding container d2256ad57d1d9378fbd857e046de50c9d4d6fa7142de273c2973481686933b40: Status 404 returned error can't find the container with id d2256ad57d1d9378fbd857e046de50c9d4d6fa7142de273c2973481686933b40 Feb 17 14:52:16 crc kubenswrapper[4717]: W0217 14:52:16.351927 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-9caf67262b025fcd6393cb81668b5f73bbaee2842aea2e1865783a41201753af WatchSource:0}: Error finding container 9caf67262b025fcd6393cb81668b5f73bbaee2842aea2e1865783a41201753af: Status 404 returned error can't find the container with id 9caf67262b025fcd6393cb81668b5f73bbaee2842aea2e1865783a41201753af Feb 17 14:52:16 crc kubenswrapper[4717]: W0217 14:52:16.353819 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6de7898b02a669b96bc50909c71d7acd3f5f3299f8295fea006ebfba9df12750 
WatchSource:0}: Error finding container 6de7898b02a669b96bc50909c71d7acd3f5f3299f8295fea006ebfba9df12750: Status 404 returned error can't find the container with id 6de7898b02a669b96bc50909c71d7acd3f5f3299f8295fea006ebfba9df12750 Feb 17 14:52:16 crc kubenswrapper[4717]: W0217 14:52:16.358312 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-c2bffc9459310fa3da96c3ae59f6128c3352e47efc0578990248e1f0d7fab550 WatchSource:0}: Error finding container c2bffc9459310fa3da96c3ae59f6128c3352e47efc0578990248e1f0d7fab550: Status 404 returned error can't find the container with id c2bffc9459310fa3da96c3ae59f6128c3352e47efc0578990248e1f0d7fab550 Feb 17 14:52:16 crc kubenswrapper[4717]: E0217 14:52:16.390296 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="800ms" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.614751 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.615974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.616009 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.616019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.616044 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:52:16 crc kubenswrapper[4717]: E0217 14:52:16.616694 4717 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.74:6443: connect: connection refused" node="crc" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.782234 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.787229 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 11:43:54.88315409 +0000 UTC Feb 17 14:52:16 crc kubenswrapper[4717]: W0217 14:52:16.844609 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:16 crc kubenswrapper[4717]: E0217 14:52:16.844771 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.850791 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d2256ad57d1d9378fbd857e046de50c9d4d6fa7142de273c2973481686933b40"} Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.851944 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d86325926430910f9437b14fb839de697fd363cad479d894e449493c0df01014"} Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.853293 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2bffc9459310fa3da96c3ae59f6128c3352e47efc0578990248e1f0d7fab550"} Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.854642 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6de7898b02a669b96bc50909c71d7acd3f5f3299f8295fea006ebfba9df12750"} Feb 17 14:52:16 crc kubenswrapper[4717]: I0217 14:52:16.855732 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9caf67262b025fcd6393cb81668b5f73bbaee2842aea2e1865783a41201753af"} Feb 17 14:52:17 crc kubenswrapper[4717]: E0217 14:52:17.191530 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="1.6s" Feb 17 14:52:17 crc kubenswrapper[4717]: W0217 14:52:17.287352 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:17 crc kubenswrapper[4717]: E0217 14:52:17.287426 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:52:17 crc kubenswrapper[4717]: W0217 14:52:17.307580 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:17 crc kubenswrapper[4717]: E0217 14:52:17.307699 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:52:17 crc kubenswrapper[4717]: W0217 14:52:17.383993 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:17 crc kubenswrapper[4717]: E0217 14:52:17.384156 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.417866 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.420726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:17 
crc kubenswrapper[4717]: I0217 14:52:17.420773 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.420786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.420823 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:52:17 crc kubenswrapper[4717]: E0217 14:52:17.421493 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.74:6443: connect: connection refused" node="crc" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.651227 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 14:52:17 crc kubenswrapper[4717]: E0217 14:52:17.652827 4717 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.782279 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.787368 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 05:44:50.838359927 +0000 UTC Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.860971 4717 generic.go:334] "Generic (PLEG): 
container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dc6bbb6613117300efa6c728ba584bd5bdeccefa8fc50d1fe9864e603ac360fc" exitCode=0 Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.861058 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dc6bbb6613117300efa6c728ba584bd5bdeccefa8fc50d1fe9864e603ac360fc"} Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.861195 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.862589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.862641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.862659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.864778 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.864810 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757"} Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.865501 4717 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757" exitCode=0 Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.866421 4717 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.866479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.866504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.869549 4717 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5" exitCode=0 Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.869716 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5"} Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.869773 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.871355 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.871402 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.871423 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.874441 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899"} Feb 17 14:52:17 crc 
kubenswrapper[4717]: I0217 14:52:17.874474 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874"} Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.874498 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b"} Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.874512 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236"} Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.874518 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.875708 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.875734 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.875748 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.877433 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e" exitCode=0 Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.877476 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e"} Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.877621 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.878937 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.878989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.878999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.884709 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.886345 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.886391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:17 crc kubenswrapper[4717]: I0217 14:52:17.886409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.654227 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.782736 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.787579 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 17:09:36.126990658 +0000 UTC Feb 17 14:52:18 crc kubenswrapper[4717]: E0217 14:52:18.793507 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="3.2s" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.884871 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bf1bef80b531a17e335240f260db84a0ed014c484239aa9d94d136ddf0188d6a"} Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.884938 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ff43e83143a6518fea4cb55cdc369649446cd2e15b3a776ac3df5f04f65a1744"} Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.884955 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9a40a2833a2d7bc1d9e94e0101eadfa1bc418f4143d686f41810b26566671bee"} Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.890028 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034"} Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.890130 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b"} Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.890154 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685"} Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.892845 4717 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="15888f573d471aca492e61355f53b2471cb3b6f73b5329357afc1329dcee5d4b" exitCode=0 Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.892931 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"15888f573d471aca492e61355f53b2471cb3b6f73b5329357afc1329dcee5d4b"} Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.893049 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.894585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.894639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.894657 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.895804 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"429e9be80a82dc2fb9996d26cdf005ecf8ac03a19141cac4a5928d632cfc1e19"} Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.895860 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.895920 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.897316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.897367 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.897383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.897439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.897468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:18 crc kubenswrapper[4717]: I0217 14:52:18.897482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:18 crc kubenswrapper[4717]: W0217 14:52:18.954493 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 
38.102.83.74:6443: connect: connection refused Feb 17 14:52:18 crc kubenswrapper[4717]: E0217 14:52:18.954621 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.022146 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.023874 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.023917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.023931 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.023968 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:52:19 crc kubenswrapper[4717]: E0217 14:52:19.024617 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.74:6443: connect: connection refused" node="crc" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.782271 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.787705 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 
05:53:03 +0000 UTC, rotation deadline is 2025-11-09 14:36:08.445296375 +0000 UTC Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.848072 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.904580 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dd153def3c5940ab0c7f69f623f1c770ec8da64fda7ac1a6b5575612c074aa5b"} Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.904650 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034"} Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.904782 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.906414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.906467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.906487 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.907980 4717 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a59f8ffca58355293ded83f23c41b7528ce508ce893c67a9b0ad699013737cf2" exitCode=0 Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.908059 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a59f8ffca58355293ded83f23c41b7528ce508ce893c67a9b0ad699013737cf2"} Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.908123 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.908159 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.908268 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.908286 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.909547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.909589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.909604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.909760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.909806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.909829 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.910402 4717 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.910461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.910485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.910520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.910541 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.910551 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.920547 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.920956 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Feb 17 14:52:19 crc kubenswrapper[4717]: I0217 14:52:19.921066 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" Feb 17 14:52:19 crc kubenswrapper[4717]: W0217 14:52:19.933799 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:19 crc kubenswrapper[4717]: E0217 14:52:19.933910 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:52:20 crc kubenswrapper[4717]: W0217 14:52:20.275475 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:20 crc kubenswrapper[4717]: E0217 14:52:20.275565 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:52:20 crc kubenswrapper[4717]: W0217 14:52:20.300818 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:20 crc kubenswrapper[4717]: E0217 14:52:20.300968 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection 
refused" logger="UnhandledError" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.782177 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.788784 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 23:28:43.350748506 +0000 UTC Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.912640 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.914413 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dd153def3c5940ab0c7f69f623f1c770ec8da64fda7ac1a6b5575612c074aa5b" exitCode=255 Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.914470 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dd153def3c5940ab0c7f69f623f1c770ec8da64fda7ac1a6b5575612c074aa5b"} Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.914591 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.915521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.915549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.915560 4717 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.916118 4717 scope.go:117] "RemoveContainer" containerID="dd153def3c5940ab0c7f69f623f1c770ec8da64fda7ac1a6b5575612c074aa5b" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.918844 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"00b9dbac5f1b401bd29895d158a6375875fde57062dd9a9e3d50b662a4fd8c19"} Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.918883 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a093154833c54fe11fcf8321bba7d1e5fd88b07259bc6012bce1eac5de58cfb1"} Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.918918 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.918964 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.919008 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.919861 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.919889 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.919900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.919962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.919998 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:20 crc kubenswrapper[4717]: I0217 14:52:20.920018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.074191 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.789626 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 19:12:33.183276907 +0000 UTC Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.924396 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.927337 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23"} Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.927575 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.927739 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.928770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.928818 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.928836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.934041 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7a24c45d759a685174f2796c6b4693de58dfdb082ccbc9b889857572694512ab"} Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.934097 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"849cfdcc83a8c65d74eaa79ec65157725c36f10f879a0e48aa3b9a9483886652"} Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.934114 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"96469e1b3a3e0c2a7a131f109884c7f6d1c4f72a8959c15f308ff276883687e4"} Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.934271 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.935759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.935794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.935803 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:21 crc kubenswrapper[4717]: I0217 14:52:21.989803 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 14:52:21 crc 
kubenswrapper[4717]: I0217 14:52:21.989970 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.225296 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.226859 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.226892 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.226902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.226932 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.332993 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.490929 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.491146 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.492513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.492568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.492589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.790581 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 05:20:38.650264609 +0000 UTC Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.848565 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.848736 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.937692 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.937741 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.937692 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.939105 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.939124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.939138 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.939141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.939149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:22 crc kubenswrapper[4717]: I0217 14:52:22.939151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.736690 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.736895 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.738237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.738314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.738335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.791136 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 06:52:36.113169519 +0000 UTC Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.941336 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.941384 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.943231 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.943327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.943357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.943194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.943431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:23 crc kubenswrapper[4717]: I0217 14:52:23.943454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:24 crc kubenswrapper[4717]: I0217 14:52:24.791711 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 17:05:24.061151517 +0000 UTC Feb 17 14:52:25 crc kubenswrapper[4717]: I0217 14:52:25.791893 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 13:12:48.853350768 +0000 UTC Feb 17 14:52:25 crc kubenswrapper[4717]: E0217 14:52:25.920348 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 14:52:26 crc kubenswrapper[4717]: I0217 14:52:26.083730 4717 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 17 14:52:26 crc kubenswrapper[4717]: I0217 14:52:26.083994 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:26 crc kubenswrapper[4717]: I0217 14:52:26.085777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:26 crc kubenswrapper[4717]: I0217 14:52:26.085823 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:26 crc kubenswrapper[4717]: I0217 14:52:26.085833 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:26 crc kubenswrapper[4717]: I0217 14:52:26.801124 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 14:59:00.872008654 +0000 UTC Feb 17 14:52:27 crc kubenswrapper[4717]: I0217 14:52:27.801705 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 06:59:03.254003413 +0000 UTC Feb 17 14:52:27 crc kubenswrapper[4717]: I0217 14:52:27.819152 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:52:27 crc kubenswrapper[4717]: I0217 14:52:27.819334 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:27 crc kubenswrapper[4717]: I0217 14:52:27.820908 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:27 crc kubenswrapper[4717]: I0217 14:52:27.820947 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:27 crc kubenswrapper[4717]: 
I0217 14:52:27.820963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:27 crc kubenswrapper[4717]: I0217 14:52:27.838358 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:52:27 crc kubenswrapper[4717]: I0217 14:52:27.953581 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:27 crc kubenswrapper[4717]: I0217 14:52:27.955707 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:27 crc kubenswrapper[4717]: I0217 14:52:27.955758 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:27 crc kubenswrapper[4717]: I0217 14:52:27.955770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:27 crc kubenswrapper[4717]: I0217 14:52:27.962146 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:52:28 crc kubenswrapper[4717]: I0217 14:52:28.802419 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 09:19:02.494306383 +0000 UTC Feb 17 14:52:28 crc kubenswrapper[4717]: I0217 14:52:28.955569 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:28 crc kubenswrapper[4717]: I0217 14:52:28.957013 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:28 crc kubenswrapper[4717]: I0217 14:52:28.957122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:28 crc 
kubenswrapper[4717]: I0217 14:52:28.957187 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:29 crc kubenswrapper[4717]: I0217 14:52:29.803409 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 10:25:15.17570064 +0000 UTC Feb 17 14:52:30 crc kubenswrapper[4717]: I0217 14:52:30.803904 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 13:30:35.98755025 +0000 UTC Feb 17 14:52:31 crc kubenswrapper[4717]: I0217 14:52:31.075393 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 17 14:52:31 crc kubenswrapper[4717]: I0217 14:52:31.075504 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 17 14:52:31 crc kubenswrapper[4717]: I0217 14:52:31.783639 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 17 14:52:31 crc kubenswrapper[4717]: I0217 14:52:31.804946 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 17:55:20.603903755 +0000 UTC Feb 17 14:52:31 crc kubenswrapper[4717]: E0217 14:52:31.991388 4717 
certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 17 14:52:31 crc kubenswrapper[4717]: E0217 14:52:31.995788 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.029945 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.030198 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.031702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.031728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.031735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.063527 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.133849 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with 
statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.133945 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.139897 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.139965 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.805309 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 12:54:33.449121392 +0000 UTC Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.849706 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:52:32 crc 
kubenswrapper[4717]: I0217 14:52:32.849801 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.966490 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.968054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.968109 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:32 crc kubenswrapper[4717]: I0217 14:52:32.968120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:33 crc kubenswrapper[4717]: I0217 14:52:33.805663 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 10:56:57.861905411 +0000 UTC Feb 17 14:52:34 crc kubenswrapper[4717]: I0217 14:52:34.806724 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 15:26:24.66015865 +0000 UTC Feb 17 14:52:34 crc kubenswrapper[4717]: I0217 14:52:34.925965 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:52:34 crc kubenswrapper[4717]: I0217 14:52:34.926221 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:34 crc kubenswrapper[4717]: I0217 
14:52:34.928062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:34 crc kubenswrapper[4717]: I0217 14:52:34.928220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:34 crc kubenswrapper[4717]: I0217 14:52:34.928306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:34 crc kubenswrapper[4717]: I0217 14:52:34.931754 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:52:34 crc kubenswrapper[4717]: I0217 14:52:34.972505 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:34 crc kubenswrapper[4717]: I0217 14:52:34.973589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:34 crc kubenswrapper[4717]: I0217 14:52:34.973608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:34 crc kubenswrapper[4717]: I0217 14:52:34.973622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:35 crc kubenswrapper[4717]: I0217 14:52:35.807693 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 17:49:10.329847483 +0000 UTC Feb 17 14:52:35 crc kubenswrapper[4717]: E0217 14:52:35.920500 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 14:52:36 crc kubenswrapper[4717]: I0217 14:52:36.808802 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 
03:26:58.343825487 +0000 UTC Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.138857 4717 trace.go:236] Trace[263912011]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 14:52:24.368) (total time: 12770ms): Feb 17 14:52:37 crc kubenswrapper[4717]: Trace[263912011]: ---"Objects listed" error: 12770ms (14:52:37.138) Feb 17 14:52:37 crc kubenswrapper[4717]: Trace[263912011]: [12.770147075s] [12.770147075s] END Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.138921 4717 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.138945 4717 trace.go:236] Trace[792154182]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 14:52:25.748) (total time: 11389ms): Feb 17 14:52:37 crc kubenswrapper[4717]: Trace[792154182]: ---"Objects listed" error: 11389ms (14:52:37.138) Feb 17 14:52:37 crc kubenswrapper[4717]: Trace[792154182]: [11.389869585s] [11.389869585s] END Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.138981 4717 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.140934 4717 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.140973 4717 trace.go:236] Trace[1466837364]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 14:52:24.184) (total time: 12956ms): Feb 17 14:52:37 crc kubenswrapper[4717]: Trace[1466837364]: ---"Objects listed" error: 12956ms (14:52:37.140) Feb 17 14:52:37 crc kubenswrapper[4717]: Trace[1466837364]: [12.956219887s] [12.956219887s] END Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.140999 4717 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 
14:52:37.142429 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.145383 4717 trace.go:236] Trace[2074103826]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 14:52:25.245) (total time: 11899ms): Feb 17 14:52:37 crc kubenswrapper[4717]: Trace[2074103826]: ---"Objects listed" error: 11899ms (14:52:37.145) Feb 17 14:52:37 crc kubenswrapper[4717]: Trace[2074103826]: [11.899482011s] [11.899482011s] END Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.145422 4717 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.168565 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46536->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.168636 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46536->192.168.126.11:17697: read: connection reset by peer" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.168956 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 17 14:52:37 crc 
kubenswrapper[4717]: I0217 14:52:37.168983 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.778770 4717 apiserver.go:52] "Watching apiserver" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.797844 4717 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.798135 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.798465 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.798626 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.798694 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.798832 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.798917 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.798911 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.798973 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.799115 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.799303 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.802420 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.802488 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.802648 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.802878 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.802965 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.803042 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.803144 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.803260 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.803265 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.809603 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline 
is 2025-11-15 08:29:59.638509374 +0000 UTC Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.851821 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.870743 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.881586 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.887481 4717 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.891540 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.904750 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.915003 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.928442 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.940294 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946552 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946599 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946621 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946647 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946664 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946680 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946856 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946908 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946933 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946949 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946970 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946996 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947018 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947036 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947057 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947075 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947112 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947137 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947206 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947225 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947249 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947270 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947289 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947306 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947325 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 14:52:37 crc 
kubenswrapper[4717]: I0217 14:52:37.947346 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947364 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947384 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947404 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947423 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947444 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947466 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947488 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947506 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947531 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947549 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947565 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947580 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947597 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947615 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947635 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947652 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" 
(UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947671 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947690 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947706 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947723 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947739 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 
14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947759 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947779 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947798 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947816 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947834 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947853 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947869 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947887 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947905 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947923 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947942 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:52:37 crc 
kubenswrapper[4717]: I0217 14:52:37.947965 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.946981 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947026 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947233 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947245 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947817 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948130 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947473 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947635 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947650 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947647 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947872 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947917 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947978 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.947990 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948285 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948304 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948336 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948339 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948359 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948379 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948381 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948385 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948426 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948423 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948450 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948468 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948486 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948503 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948520 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948536 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948551 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948570 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948574 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948588 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948604 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948621 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948635 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948651 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948667 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948685 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948700 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948719 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948733 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948734 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: 
"87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948750 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948768 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948785 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948801 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948818 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948838 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948855 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948871 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948887 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948903 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948918 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948934 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948957 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948982 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949002 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949023 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949039 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949054 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949071 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949107 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949124 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949160 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949175 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949190 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949206 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949221 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 
14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949238 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949255 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949276 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949301 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949325 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949345 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949368 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949391 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949414 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949436 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949456 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:52:37 crc 
kubenswrapper[4717]: I0217 14:52:37.949472 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949488 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949504 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949522 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949538 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949553 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949570 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949593 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949621 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949644 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949668 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949690 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949712 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949734 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949750 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949766 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949782 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949798 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949817 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949836 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949854 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949870 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949887 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949904 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949921 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949939 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949956 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949977 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 14:52:37 crc 
kubenswrapper[4717]: I0217 14:52:37.949999 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950020 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950037 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950061 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950112 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950129 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950164 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950180 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950196 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950213 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950229 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950247 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950265 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950281 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950297 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950314 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950330 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950346 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950363 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950379 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950401 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950504 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950531 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950554 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950574 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950598 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950646 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950666 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950682 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950716 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950735 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950753 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950771 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950788 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950804 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950821 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950838 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950854 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950871 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950887 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950908 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950932 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950953 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950973 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950999 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951046 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951094 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951134 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951155 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951173 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951193 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951213 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951234 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951254 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951272 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951293 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951313 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951330 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951349 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951419 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951434 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951447 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951457 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951469 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" 
Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951481 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951494 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951508 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951522 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951534 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951543 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951552 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951565 4717 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951576 4717 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951590 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951602 4717 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951613 4717 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951626 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951637 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951649 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.952539 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.961182 4717 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948764 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948870 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948945 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.948976 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949007 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949021 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949238 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949276 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949293 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949296 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949349 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949433 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949727 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.969397 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.969433 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.969872 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.970288 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.970444 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.970704 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.970740 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949764 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949821 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.949944 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950020 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950204 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950440 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950565 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950580 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950684 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950706 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.950999 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951009 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951175 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951355 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951396 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951496 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951758 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.951894 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.952041 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.952049 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.952260 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.952607 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.952688 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.952984 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.953028 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.953035 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.953353 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.953426 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.953506 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.953524 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.953534 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.953580 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.953792 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.954070 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.954133 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.954218 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.954264 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.954296 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.954645 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.954684 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.954717 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.954777 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.955401 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.955508 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.955523 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.955913 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.956064 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.956241 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.956435 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.956644 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.956661 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.956745 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.956893 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.957195 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.957249 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.957259 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.957664 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.971243 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.958226 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:52:38.458205291 +0000 UTC m=+24.874045767 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959150 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959167 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959266 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959287 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959289 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959318 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959513 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959563 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959611 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959615 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959734 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959944 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.959892 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.960232 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.960251 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.960269 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.960489 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.960870 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.961152 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.961169 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.961211 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.961288 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.961391 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.961636 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.961857 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.963023 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.963120 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.963895 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.964038 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.964221 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.964571 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.964845 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.961626 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.966295 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.966349 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.966565 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.966636 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.967051 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.967149 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.967412 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.967480 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.967526 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.967567 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.967526 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.967643 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.967907 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.967941 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.968144 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.968241 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.968323 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.968484 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.968579 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.968580 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.968640 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.968824 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.968883 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.969073 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.971637 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.972423 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:38.472229776 +0000 UTC m=+24.888070252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.975199 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.975228 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.975244 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.975311 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.975331 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.975344 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.975319 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:38.475298848 +0000 UTC m=+24.891139404 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.975396 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:38.47538427 +0000 UTC m=+24.891224746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:37 crc kubenswrapper[4717]: E0217 14:52:37.975482 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-17 14:52:38.475473492 +0000 UTC m=+24.891314058 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.975886 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.975970 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.976036 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.976119 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.976293 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.976290 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.976310 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.976448 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.976513 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.976576 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.976649 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.976924 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.977457 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.977572 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.978214 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.978536 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.980996 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.981103 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.981203 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.981305 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.981425 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.981484 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.981545 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.981710 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.981281 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.982103 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.982325 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.983209 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.983909 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.983946 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.983969 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.984385 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.984434 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.984523 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.984535 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.985282 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.988416 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.988517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.990584 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.990820 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23" exitCode=255 Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.990859 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23"} Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.990909 4717 scope.go:117] "RemoveContainer" containerID="dd153def3c5940ab0c7f69f623f1c770ec8da64fda7ac1a6b5575612c074aa5b" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.993306 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.993666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:52:37 crc kubenswrapper[4717]: I0217 14:52:37.999480 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.002577 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.005975 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.006259 4717 scope.go:117] "RemoveContainer" containerID="2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23" Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.006572 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.006851 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.007299 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.007411 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.009002 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.009263 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.015857 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.026368 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.026395 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.029718 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.035595 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.044160 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.052866 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.052915 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053061 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053115 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053102 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053192 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053209 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053221 4717 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053233 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053244 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053254 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053267 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053276 4717 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053287 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053297 4717 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053308 4717 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053317 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053327 4717 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053337 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053346 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053355 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053364 4717 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053373 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053381 4717 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" 
Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053390 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053358 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053401 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053467 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053482 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053493 4717 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053505 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053515 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053523 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053532 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053541 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053550 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053558 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053567 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053576 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053585 4717 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053594 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053603 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053612 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053636 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053645 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053654 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053665 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053673 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053681 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053689 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053697 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" 
Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053705 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053714 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053723 4717 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053733 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053745 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053754 4717 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053762 4717 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053772 4717 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053780 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053789 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053799 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053808 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053818 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053826 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053835 4717 reconciler_common.go:293] "Volume 
detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053845 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053853 4717 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053862 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053870 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053878 4717 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053888 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053896 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053904 4717 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053912 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053921 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053929 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053938 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053946 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053955 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on 
node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053963 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053972 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053981 4717 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053989 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.053997 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054005 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054014 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054023 4717 
reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054032 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054040 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054051 4717 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054060 4717 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054067 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054077 4717 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054111 4717 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054120 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054129 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054137 4717 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054144 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054153 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054161 4717 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054169 4717 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054177 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054186 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054194 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054203 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054212 4717 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054220 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054229 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" 
DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054237 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054246 4717 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054256 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054264 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054273 4717 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054280 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054289 4717 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054297 4717 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054305 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054313 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054321 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054329 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054338 4717 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054347 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054355 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054364 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054373 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054381 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054390 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054400 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054413 4717 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054429 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node 
\"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054439 4717 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054449 4717 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054459 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054468 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054481 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054492 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054501 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054512 4717 reconciler_common.go:293] "Volume detached for 
volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054524 4717 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054535 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054545 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054553 4717 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054562 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054570 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054579 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054587 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054595 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054604 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054612 4717 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054620 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054629 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054637 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") 
on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054648 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054657 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054667 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054676 4717 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054685 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054693 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054702 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054710 4717 reconciler_common.go:293] "Volume detached for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054718 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054726 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054734 4717 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054742 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054751 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054760 4717 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054774 4717 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054781 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054790 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054799 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054807 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054816 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054824 4717 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054832 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054841 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054851 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054859 4717 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.054867 4717 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.110568 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:52:38 crc kubenswrapper[4717]: W0217 14:52:38.121732 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-0227836d7ad6e5384ab4573d4b6b80889fd01c85bb4c466874bf06e327bd9015 WatchSource:0}: Error finding container 0227836d7ad6e5384ab4573d4b6b80889fd01c85bb4c466874bf06e327bd9015: Status 404 returned error can't find the container with id 0227836d7ad6e5384ab4573d4b6b80889fd01c85bb4c466874bf06e327bd9015 Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.123577 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.130159 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:52:38 crc kubenswrapper[4717]: W0217 14:52:38.135818 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-922cb3f66bb7f866d157d3515767dddc827151d4fe39617c41fb11871dd6d47c WatchSource:0}: Error finding container 922cb3f66bb7f866d157d3515767dddc827151d4fe39617c41fb11871dd6d47c: Status 404 returned error can't find the container with id 922cb3f66bb7f866d157d3515767dddc827151d4fe39617c41fb11871dd6d47c Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.458794 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:38 crc 
kubenswrapper[4717]: E0217 14:52:38.459271 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:52:39.459217168 +0000 UTC m=+25.875057744 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.559844 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.559893 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.559916 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.559937 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560013 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560068 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:39.560054123 +0000 UTC m=+25.975894599 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560012 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560134 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:39.560124935 +0000 UTC m=+25.975965411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560230 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560288 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560304 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560306 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560362 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560386 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560407 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:39.560371071 +0000 UTC m=+25.976211697 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:38 crc kubenswrapper[4717]: E0217 14:52:38.560479 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:39.560447513 +0000 UTC m=+25.976288159 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.809743 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 03:40:01.535323109 +0000 UTC Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.994772 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"255a60a0bc3d4ea990a07073cd72d6455f7635c4309e4217e05213902152da2e"} Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.996864 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5"} Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.996937 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f"} Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.996954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"922cb3f66bb7f866d157d3515767dddc827151d4fe39617c41fb11871dd6d47c"} Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.998550 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453"} Feb 17 14:52:38 crc kubenswrapper[4717]: I0217 14:52:38.998588 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0227836d7ad6e5384ab4573d4b6b80889fd01c85bb4c466874bf06e327bd9015"} Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.000333 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.002399 4717 scope.go:117] "RemoveContainer" containerID="2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23" Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.002565 4717 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.036071 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.049206 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.061969 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.076936 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd153def3c5940ab0c7f69f623f1c770ec8da64fda7ac1a6b5575612c074aa5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:20Z\\\",\\\"message\\\":\\\"W0217 14:52:20.167963 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 14:52:20.168675 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771339940 cert, and key in /tmp/serving-cert-3123395212/serving-signer.crt, /tmp/serving-cert-3123395212/serving-signer.key\\\\nI0217 14:52:20.568545 1 observer_polling.go:159] Starting file observer\\\\nW0217 14:52:20.572659 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 14:52:20.572901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:20.574069 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3123395212/tls.crt::/tmp/serving-cert-3123395212/tls.key\\\\\\\"\\\\nF0217 14:52:20.798279 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.089550 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.104103 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.115348 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.128064 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.142054 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.171379 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.183298 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.194599 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.206189 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.216018 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.469731 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.469980 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:52:41.469932465 +0000 UTC m=+27.885772931 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.570994 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.571054 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.571132 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.571165 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571350 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571357 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571436 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571481 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571375 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571497 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571509 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:41.571465528 +0000 UTC m=+27.987306164 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571510 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571566 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:41.571544721 +0000 UTC m=+27.987385217 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571623 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:41.571610432 +0000 UTC m=+27.987451088 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571667 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.571720 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:41.571709515 +0000 UTC m=+27.987550181 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.707819 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-xr7pz"] Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.708301 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-dtt4m"] Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.708564 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xr7pz" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.708755 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.709266 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4n7g7"] Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.710883 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-nfvrt"] Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.711298 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.711758 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.715286 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4f7wr"] Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.715510 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.715630 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.715729 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.715737 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.715886 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.715964 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.718884 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.719040 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.719493 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.719666 4717 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.719793 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.719894 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.720025 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.720246 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.720418 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.720527 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.725416 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.725713 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.725726 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.725814 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.725922 4717 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.725977 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.726170 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.735324 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.757486 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.772688 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.794683 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.806832 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.810821 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 15:15:54.094928452 +0000 UTC Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.827640 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.848377 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.848510 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.848718 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.848526 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.848510 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:39 crc kubenswrapper[4717]: E0217 14:52:39.848850 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.850495 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.851209 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.852982 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.853868 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.855264 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.855909 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.856805 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.857988 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.858353 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.859271 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.860592 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.861216 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 
14:52:39.863579 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.864398 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.865027 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.866210 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.866761 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.867788 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.868211 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.868792 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 
14:52:39.869902 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.870373 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.874413 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.874629 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-var-lib-cni-multus\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.874671 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-conf-dir\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.874691 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-socket-dir-parent\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.874709 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-run-netns\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.874725 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-var-lib-kubelet\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.874744 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-netns\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.874859 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-systemd-units\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.874951 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-config\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875013 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-mcd-auth-proxy-config\") pod \"machine-config-daemon-dtt4m\" (UID: \"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875053 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875133 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-system-cni-dir\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875167 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-node-log\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875226 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2khb\" (UniqueName: \"kubernetes.io/projected/3daa865c-6e58-4512-9be1-5d3a490a2f7a-kube-api-access-t2khb\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875262 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-openvswitch\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875504 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-system-cni-dir\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875504 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875552 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-etc-openvswitch\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc 
kubenswrapper[4717]: I0217 14:52:39.875631 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-os-release\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875689 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3daa865c-6e58-4512-9be1-5d3a490a2f7a-cni-binary-copy\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875727 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-run-multus-certs\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-kubelet\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875854 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-netd\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 
crc kubenswrapper[4717]: I0217 14:52:39.875951 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkszq\" (UniqueName: \"kubernetes.io/projected/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-kube-api-access-gkszq\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.875992 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-cnibin\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876033 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-cni-binary-copy\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876066 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xwkz\" (UniqueName: \"kubernetes.io/projected/3306d20c-0893-4cc3-b35d-41b6365c7aaa-kube-api-access-5xwkz\") pod \"node-resolver-xr7pz\" (UID: \"3306d20c-0893-4cc3-b35d-41b6365c7aaa\") " pod="openshift-dns/node-resolver-xr7pz" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876135 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-var-lib-cni-bin\") pod \"multus-nfvrt\" (UID: 
\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876162 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-systemd\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876213 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-log-socket\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876299 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-etc-kubernetes\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876346 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h5lq\" (UniqueName: \"kubernetes.io/projected/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-kube-api-access-2h5lq\") pod \"machine-config-daemon-dtt4m\" (UID: \"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-cni-dir\") pod \"multus-nfvrt\" 
(UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876481 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-daemon-config\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876517 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-cnibin\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876571 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876600 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-run-k8s-cni-cncf-io\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876633 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-var-lib-openvswitch\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876683 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-env-overrides\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876731 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmrwj\" (UniqueName: \"kubernetes.io/projected/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-kube-api-access-dmrwj\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876786 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-rootfs\") pod \"machine-config-daemon-dtt4m\" (UID: \"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876819 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876898 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-ovn\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.876983 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-os-release\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.877019 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-hostroot\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.877135 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovn-node-metrics-cert\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.877169 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3306d20c-0893-4cc3-b35d-41b6365c7aaa-hosts-file\") pod \"node-resolver-xr7pz\" (UID: \"3306d20c-0893-4cc3-b35d-41b6365c7aaa\") " pod="openshift-dns/node-resolver-xr7pz" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.877250 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-slash\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 
14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.877403 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-bin\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.877452 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.877476 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-script-lib\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.877497 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.877521 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-proxy-tls\") pod \"machine-config-daemon-dtt4m\" (UID: 
\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.877534 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.877538 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.878382 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.879721 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.880252 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.881535 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.882052 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.882965 4717 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.883098 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.885039 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.886241 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.886784 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.888502 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.889181 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.890219 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.890963 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.892140 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.892729 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.892921 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.894031 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.895015 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.896304 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.896796 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.897807 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.898433 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.899732 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.900296 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.901418 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.902045 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.903140 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.903866 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.904502 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.905813 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.918317 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.918414 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.925267 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.950217 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978344 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-ovn\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc 
kubenswrapper[4717]: I0217 14:52:39.978406 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978441 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-os-release\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978468 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-hostroot\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978497 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovn-node-metrics-cert\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978527 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3306d20c-0893-4cc3-b35d-41b6365c7aaa-hosts-file\") pod \"node-resolver-xr7pz\" (UID: \"3306d20c-0893-4cc3-b35d-41b6365c7aaa\") " pod="openshift-dns/node-resolver-xr7pz" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978552 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-slash\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978576 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978605 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-bin\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978632 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978658 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-script-lib\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978684 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978708 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-proxy-tls\") pod \"machine-config-daemon-dtt4m\" (UID: \"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978731 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-var-lib-cni-multus\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978753 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-conf-dir\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978778 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-socket-dir-parent\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978799 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-run-netns\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978825 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-var-lib-kubelet\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978847 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-netns\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978868 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-systemd-units\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978890 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-config\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978911 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-dtt4m\" (UID: \"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978946 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-system-cni-dir\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.978969 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-node-log\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979005 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2khb\" (UniqueName: \"kubernetes.io/projected/3daa865c-6e58-4512-9be1-5d3a490a2f7a-kube-api-access-t2khb\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979027 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-openvswitch\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979049 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-system-cni-dir\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") 
" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979102 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-etc-openvswitch\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979130 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-os-release\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979153 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-cni-binary-copy\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979175 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3daa865c-6e58-4512-9be1-5d3a490a2f7a-cni-binary-copy\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979197 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-run-multus-certs\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " 
pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979220 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-kubelet\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979243 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-netd\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979265 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkszq\" (UniqueName: \"kubernetes.io/projected/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-kube-api-access-gkszq\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979289 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-cnibin\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979310 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xwkz\" (UniqueName: \"kubernetes.io/projected/3306d20c-0893-4cc3-b35d-41b6365c7aaa-kube-api-access-5xwkz\") pod \"node-resolver-xr7pz\" (UID: \"3306d20c-0893-4cc3-b35d-41b6365c7aaa\") " pod="openshift-dns/node-resolver-xr7pz" Feb 17 14:52:39 crc 
kubenswrapper[4717]: I0217 14:52:39.979341 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-var-lib-cni-bin\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979361 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-systemd\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979397 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-log-socket\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979426 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-etc-kubernetes\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979447 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h5lq\" (UniqueName: \"kubernetes.io/projected/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-kube-api-access-2h5lq\") pod \"machine-config-daemon-dtt4m\" (UID: \"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979467 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-cni-dir\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979486 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-daemon-config\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979505 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-cnibin\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979525 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-run-k8s-cni-cncf-io\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979545 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-var-lib-openvswitch\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979566 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-env-overrides\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979587 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmrwj\" (UniqueName: \"kubernetes.io/projected/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-kube-api-access-dmrwj\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979609 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-rootfs\") pod \"machine-config-daemon-dtt4m\" (UID: \"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979689 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-rootfs\") pod \"machine-config-daemon-dtt4m\" (UID: \"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.979739 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-ovn\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.980443 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.980683 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-os-release\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.980736 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-system-cni-dir\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.980734 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-hostroot\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.980825 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-etc-openvswitch\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.980895 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-os-release\") pod \"multus-additional-cni-plugins-4n7g7\" 
(UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981232 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3306d20c-0893-4cc3-b35d-41b6365c7aaa-hosts-file\") pod \"node-resolver-xr7pz\" (UID: \"3306d20c-0893-4cc3-b35d-41b6365c7aaa\") " pod="openshift-dns/node-resolver-xr7pz" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981311 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-slash\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981349 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981375 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-bin\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981406 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981422 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-log-socket\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981471 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-run-multus-certs\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981506 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-kubelet\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981537 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-netd\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981552 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-cni-binary-copy\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 
14:52:39.981663 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-cnibin\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981715 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-etc-kubernetes\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981862 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-cnibin\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.981989 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-var-lib-cni-bin\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982029 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-systemd\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982062 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-netns\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982095 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3daa865c-6e58-4512-9be1-5d3a490a2f7a-cni-binary-copy\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982203 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-cni-dir\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982219 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-script-lib\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982260 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-run-netns\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982291 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-system-cni-dir\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " 
pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982621 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-env-overrides\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982622 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-run-k8s-cni-cncf-io\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982633 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-node-log\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982663 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-systemd-units\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.982645 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-var-lib-kubelet\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.983173 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-mcd-auth-proxy-config\") pod \"machine-config-daemon-dtt4m\" (UID: \"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.983207 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-openvswitch\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.983220 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-daemon-config\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.983255 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-var-lib-openvswitch\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.983415 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-config\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.983442 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-conf-dir\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.983423 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-host-var-lib-cni-multus\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.983617 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3daa865c-6e58-4512-9be1-5d3a490a2f7a-multus-socket-dir-parent\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.983849 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.988038 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovn-node-metrics-cert\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.991236 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-proxy-tls\") pod 
\"machine-config-daemon-dtt4m\" (UID: \"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:39 crc kubenswrapper[4717]: I0217 14:52:39.991181 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.001307 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h5lq\" (UniqueName: \"kubernetes.io/projected/7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9-kube-api-access-2h5lq\") pod \"machine-config-daemon-dtt4m\" (UID: \"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\") " pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.003334 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xwkz\" (UniqueName: \"kubernetes.io/projected/3306d20c-0893-4cc3-b35d-41b6365c7aaa-kube-api-access-5xwkz\") pod \"node-resolver-xr7pz\" (UID: \"3306d20c-0893-4cc3-b35d-41b6365c7aaa\") " pod="openshift-dns/node-resolver-xr7pz" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.004279 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2khb\" (UniqueName: \"kubernetes.io/projected/3daa865c-6e58-4512-9be1-5d3a490a2f7a-kube-api-access-t2khb\") pod \"multus-nfvrt\" (UID: \"3daa865c-6e58-4512-9be1-5d3a490a2f7a\") " pod="openshift-multus/multus-nfvrt" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.008910 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkszq\" (UniqueName: \"kubernetes.io/projected/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-kube-api-access-gkszq\") pod \"ovnkube-node-4f7wr\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.013693 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmrwj\" 
(UniqueName: \"kubernetes.io/projected/db1be0ab-a28a-4d7a-b871-d9fc8dea5841-kube-api-access-dmrwj\") pod \"multus-additional-cni-plugins-4n7g7\" (UID: \"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\") " pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.020626 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.029670 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.036121 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-xr7pz" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.042826 4717 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.045310 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.047354 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:52:40 crc kubenswrapper[4717]: W0217 14:52:40.047614 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3306d20c_0893_4cc3_b35d_41b6365c7aaa.slice/crio-9691a4a2777e1f31140add181e0f67adb33c56cfb48d83b4f196d218ef930503 WatchSource:0}: Error finding container 9691a4a2777e1f31140add181e0f67adb33c56cfb48d83b4f196d218ef930503: Status 404 returned error can't find the container with id 9691a4a2777e1f31140add181e0f67adb33c56cfb48d83b4f196d218ef930503 Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.058185 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nfvrt" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.071241 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.072278 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.076744 4717 csr.go:261] certificate signing request csr-n5n86 is approved, waiting to be issued Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.085002 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.088582 4717 csr.go:257] certificate signing request csr-n5n86 is issued Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.098516 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.164448 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.181191 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.205962 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.236443 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.255069 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.277761 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.297753 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.319184 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.337216 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.368304 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.406707 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.426669 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.447380 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.460966 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.483437 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.499514 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.514008 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:40 crc kubenswrapper[4717]: I0217 14:52:40.811165 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 11:49:59.566153729 +0000 UTC Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.009157 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.010918 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nfvrt" event={"ID":"3daa865c-6e58-4512-9be1-5d3a490a2f7a","Type":"ContainerStarted","Data":"fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.010966 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nfvrt" 
event={"ID":"3daa865c-6e58-4512-9be1-5d3a490a2f7a","Type":"ContainerStarted","Data":"9baf5756cbd7ec5bc208b064d59274040d12e5e6bea1b3a7246805566b9c9a37"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.012773 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.012973 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.013115 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"f7f6aa0695ad627922cbc98bbdb34daf624dcac9796b926059d4e3754f7fe435"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.014179 4717 generic.go:334] "Generic (PLEG): container finished" podID="db1be0ab-a28a-4d7a-b871-d9fc8dea5841" containerID="58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9" exitCode=0 Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.014263 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" event={"ID":"db1be0ab-a28a-4d7a-b871-d9fc8dea5841","Type":"ContainerDied","Data":"58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.014309 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" 
event={"ID":"db1be0ab-a28a-4d7a-b871-d9fc8dea5841","Type":"ContainerStarted","Data":"af9ef82fafe619044f06966c32f42f30b953cd8813e66eaf665d256af9f24be5"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.015759 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b" exitCode=0 Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.015834 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.015871 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"26b1f66774dfdd2c7653e701d74aaffe333905b155324c182a3a7ca8c39433e8"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.017874 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xr7pz" event={"ID":"3306d20c-0893-4cc3-b35d-41b6365c7aaa","Type":"ContainerStarted","Data":"a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.017906 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xr7pz" event={"ID":"3306d20c-0893-4cc3-b35d-41b6365c7aaa","Type":"ContainerStarted","Data":"9691a4a2777e1f31140add181e0f67adb33c56cfb48d83b4f196d218ef930503"} Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.028595 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.047882 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.070258 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.074721 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.077897 4717 scope.go:117] "RemoveContainer" containerID="2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23" 
Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.078122 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.085432 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.089364 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 14:47:40 +0000 UTC, rotation deadline is 2026-12-13 19:05:08.762654606 +0000 UTC Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.089447 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7180h12m27.673211187s for next certificate rotation Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.099242 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.112928 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.126881 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.143655 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.159590 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.174223 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.204105 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.221791 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.239676 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.254347 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.272405 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.290546 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.303465 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.317194 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.331834 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.345761 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.360247 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.377275 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.393238 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.419530 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.440420 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 
14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.454822 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.496498 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.496748 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:52:45.496711521 +0000 UTC m=+31.912552037 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.597654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.597712 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.597748 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.597776 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.597839 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.597911 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.598037 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.598052 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.598067 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.598168 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.598183 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.598193 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.597916 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:45.597899045 +0000 UTC m=+32.013739511 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.598243 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:45.598223314 +0000 UTC m=+32.014063810 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.598261 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:45.598252635 +0000 UTC m=+32.014093121 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.598284 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:45.598269985 +0000 UTC m=+32.014110471 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.811552 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:57:19.884981862 +0000 UTC Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.846114 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.846115 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.846286 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:41 crc kubenswrapper[4717]: I0217 14:52:41.846136 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.846469 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:41 crc kubenswrapper[4717]: E0217 14:52:41.846399 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.022575 4717 generic.go:334] "Generic (PLEG): container finished" podID="db1be0ab-a28a-4d7a-b871-d9fc8dea5841" containerID="461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653" exitCode=0 Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.022682 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" event={"ID":"db1be0ab-a28a-4d7a-b871-d9fc8dea5841","Type":"ContainerDied","Data":"461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653"} Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.025597 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305"} Feb 17 14:52:42 crc 
kubenswrapper[4717]: I0217 14:52:42.025708 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555"} Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.025782 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06"} Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.025851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a"} Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.025908 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce"} Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.041795 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.061593 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.077643 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.090027 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.110028 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.125889 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 
14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.141560 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.160249 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.201637 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.233846 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.259798 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.292380 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.309145 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.361975 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4phxm"] Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.362504 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4phxm" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.365353 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.365576 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.365659 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.365749 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.376665 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.388374 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.406225 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.420550 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.478165 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d316
19ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.491702 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.506929 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.509340 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dnz8\" (UniqueName: \"kubernetes.io/projected/64268f67-a8ad-41d1-a94a-13d926ad6022-kube-api-access-5dnz8\") pod \"node-ca-4phxm\" (UID: 
\"64268f67-a8ad-41d1-a94a-13d926ad6022\") " pod="openshift-image-registry/node-ca-4phxm" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.509379 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64268f67-a8ad-41d1-a94a-13d926ad6022-host\") pod \"node-ca-4phxm\" (UID: \"64268f67-a8ad-41d1-a94a-13d926ad6022\") " pod="openshift-image-registry/node-ca-4phxm" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.509425 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/64268f67-a8ad-41d1-a94a-13d926ad6022-serviceca\") pod \"node-ca-4phxm\" (UID: \"64268f67-a8ad-41d1-a94a-13d926ad6022\") " pod="openshift-image-registry/node-ca-4phxm" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.522394 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.537593 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.551376 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.566853 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.583373 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.594152 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.609058 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.610412 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/64268f67-a8ad-41d1-a94a-13d926ad6022-serviceca\") pod \"node-ca-4phxm\" (UID: \"64268f67-a8ad-41d1-a94a-13d926ad6022\") " pod="openshift-image-registry/node-ca-4phxm" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.610482 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dnz8\" (UniqueName: \"kubernetes.io/projected/64268f67-a8ad-41d1-a94a-13d926ad6022-kube-api-access-5dnz8\") pod \"node-ca-4phxm\" (UID: \"64268f67-a8ad-41d1-a94a-13d926ad6022\") " pod="openshift-image-registry/node-ca-4phxm" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.610534 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/64268f67-a8ad-41d1-a94a-13d926ad6022-host\") pod \"node-ca-4phxm\" (UID: \"64268f67-a8ad-41d1-a94a-13d926ad6022\") " pod="openshift-image-registry/node-ca-4phxm" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.610615 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64268f67-a8ad-41d1-a94a-13d926ad6022-host\") pod \"node-ca-4phxm\" (UID: \"64268f67-a8ad-41d1-a94a-13d926ad6022\") " pod="openshift-image-registry/node-ca-4phxm" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.611876 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/64268f67-a8ad-41d1-a94a-13d926ad6022-serviceca\") pod \"node-ca-4phxm\" (UID: \"64268f67-a8ad-41d1-a94a-13d926ad6022\") " pod="openshift-image-registry/node-ca-4phxm" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.627973 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dnz8\" (UniqueName: \"kubernetes.io/projected/64268f67-a8ad-41d1-a94a-13d926ad6022-kube-api-access-5dnz8\") pod \"node-ca-4phxm\" (UID: \"64268f67-a8ad-41d1-a94a-13d926ad6022\") " pod="openshift-image-registry/node-ca-4phxm" Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.788937 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4phxm" Feb 17 14:52:42 crc kubenswrapper[4717]: W0217 14:52:42.802529 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64268f67_a8ad_41d1_a94a_13d926ad6022.slice/crio-55ac9d5358eb21ba08f32212132f8efef801cb420890fb627e981b83597c8693 WatchSource:0}: Error finding container 55ac9d5358eb21ba08f32212132f8efef801cb420890fb627e981b83597c8693: Status 404 returned error can't find the container with id 55ac9d5358eb21ba08f32212132f8efef801cb420890fb627e981b83597c8693 Feb 17 14:52:42 crc kubenswrapper[4717]: I0217 14:52:42.811926 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 01:59:26.55271032 +0000 UTC Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.032798 4717 generic.go:334] "Generic (PLEG): container finished" podID="db1be0ab-a28a-4d7a-b871-d9fc8dea5841" containerID="be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516" exitCode=0 Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.032899 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" event={"ID":"db1be0ab-a28a-4d7a-b871-d9fc8dea5841","Type":"ContainerDied","Data":"be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516"} Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.040958 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13"} Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.042503 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4phxm" 
event={"ID":"64268f67-a8ad-41d1-a94a-13d926ad6022","Type":"ContainerStarted","Data":"8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1"} Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.042533 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4phxm" event={"ID":"64268f67-a8ad-41d1-a94a-13d926ad6022","Type":"ContainerStarted","Data":"55ac9d5358eb21ba08f32212132f8efef801cb420890fb627e981b83597c8693"} Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.051271 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.063991 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.074968 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.094354 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.106296 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.125350 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.147408 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.164650 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.174128 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.192292 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.208735 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.227698 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 
14:52:43.239270 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.254405 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.543337 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.545667 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.545725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.545741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.546523 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.557428 4717 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.557663 4717 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.558922 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.558950 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.558959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.558976 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.558987 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:43Z","lastTransitionTime":"2026-02-17T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:43 crc kubenswrapper[4717]: E0217 14:52:43.577270 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.582861 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.582899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.582913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.582928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.582943 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:43Z","lastTransitionTime":"2026-02-17T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:43 crc kubenswrapper[4717]: E0217 14:52:43.595740 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.600496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.600542 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.600552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.600570 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.600579 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:43Z","lastTransitionTime":"2026-02-17T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:43 crc kubenswrapper[4717]: E0217 14:52:43.616647 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.620236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.620293 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.620314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.620342 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.620365 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:43Z","lastTransitionTime":"2026-02-17T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:43 crc kubenswrapper[4717]: E0217 14:52:43.634956 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.637817 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.637837 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.637844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.637854 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.637863 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:43Z","lastTransitionTime":"2026-02-17T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:43 crc kubenswrapper[4717]: E0217 14:52:43.650676 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:43 crc kubenswrapper[4717]: E0217 14:52:43.650785 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.652157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.652190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.652201 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.652215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.652228 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:43Z","lastTransitionTime":"2026-02-17T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.755537 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.755614 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.755637 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.755671 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.755694 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:43Z","lastTransitionTime":"2026-02-17T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.812142 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 09:35:21.793512143 +0000 UTC Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.846057 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:43 crc kubenswrapper[4717]: E0217 14:52:43.846226 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.846250 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:43 crc kubenswrapper[4717]: E0217 14:52:43.846334 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.846355 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:43 crc kubenswrapper[4717]: E0217 14:52:43.846610 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.858612 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.858650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.858662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.858682 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.858694 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:43Z","lastTransitionTime":"2026-02-17T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.962450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.962483 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.962498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.962517 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:43 crc kubenswrapper[4717]: I0217 14:52:43.962532 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:43Z","lastTransitionTime":"2026-02-17T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.049636 4717 generic.go:334] "Generic (PLEG): container finished" podID="db1be0ab-a28a-4d7a-b871-d9fc8dea5841" containerID="c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73" exitCode=0 Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.049698 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" event={"ID":"db1be0ab-a28a-4d7a-b871-d9fc8dea5841","Type":"ContainerDied","Data":"c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73"} Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.068997 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.069878 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.069924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.069939 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.069956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.069966 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:44Z","lastTransitionTime":"2026-02-17T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.085869 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.115428 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.142924 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.162961 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d316
19ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.172348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.172379 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.172388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.172402 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.172413 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:44Z","lastTransitionTime":"2026-02-17T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.179845 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.192538 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.203987 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.216627 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.231596 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.248199 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.263039 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.276918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.276965 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.276977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.276995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.277007 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:44Z","lastTransitionTime":"2026-02-17T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.292284 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 
14:52:44.307870 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.322005 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc 
kubenswrapper[4717]: I0217 14:52:44.333295 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.348686 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.364770 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.375584 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.379044 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.379070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.379092 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.379108 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.379116 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:44Z","lastTransitionTime":"2026-02-17T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.385839 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.396730 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.407803 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.421113 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.437303 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.455986 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.471888 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.481724 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.481776 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.481789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.481810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.481825 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:44Z","lastTransitionTime":"2026-02-17T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.484866 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.502356 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.584589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.585027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.585332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.585512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.585720 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:44Z","lastTransitionTime":"2026-02-17T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.688165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.688213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.688222 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.688240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.688250 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:44Z","lastTransitionTime":"2026-02-17T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.790775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.791078 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.791200 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.791293 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.791385 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:44Z","lastTransitionTime":"2026-02-17T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.812320 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 18:21:01.378579555 +0000 UTC Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.894098 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.894398 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.894467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.894528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.894586 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:44Z","lastTransitionTime":"2026-02-17T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.998243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.998307 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.998328 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.998354 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:44 crc kubenswrapper[4717]: I0217 14:52:44.998373 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:44Z","lastTransitionTime":"2026-02-17T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.057126 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" event={"ID":"db1be0ab-a28a-4d7a-b871-d9fc8dea5841","Type":"ContainerStarted","Data":"cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.065594 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.085943 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc 
kubenswrapper[4717]: I0217 14:52:45.101153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.101429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.101576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.101712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.101842 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:45Z","lastTransitionTime":"2026-02-17T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.105820 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.120862 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.135325 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.156370 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.175566 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmr
wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.193866 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.204427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.204500 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.204509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.204526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.204537 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:45Z","lastTransitionTime":"2026-02-17T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.208323 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.223448 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.257730 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.279590 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.300173 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d316
19ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.306997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.307044 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.307063 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.307115 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.307137 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:45Z","lastTransitionTime":"2026-02-17T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.322659 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.329158 4717 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.329768 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-operator/pods/iptables-alerter-4ln5h/status\": read tcp 38.102.83.74:47658->38.102.83.74:6443: use of closed network connection" Feb 17 14:52:45 crc 
kubenswrapper[4717]: I0217 14:52:45.410743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.410801 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.410812 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.410827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.410837 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:45Z","lastTransitionTime":"2026-02-17T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.514215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.514285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.514307 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.514337 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.514361 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:45Z","lastTransitionTime":"2026-02-17T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.544276 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.544737 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:52:53.544671155 +0000 UTC m=+39.960511671 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.618480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.618537 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.618555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.618587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.618606 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:45Z","lastTransitionTime":"2026-02-17T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.645908 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.645987 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.646023 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.646059 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646139 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646205 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646244 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646273 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646277 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:53.646244539 +0000 UTC m=+40.062085055 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646290 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646311 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:53.646296771 +0000 UTC m=+40.062137277 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646366 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:53.646340732 +0000 UTC m=+40.062181408 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646244 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646392 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646404 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.646436 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:53.646427084 +0000 UTC m=+40.062267810 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.722643 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.722742 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.722761 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.722787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.722807 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:45Z","lastTransitionTime":"2026-02-17T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.820931 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 23:03:24.880808185 +0000 UTC Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.825601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.825668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.825689 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.825715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.825735 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:45Z","lastTransitionTime":"2026-02-17T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.846374 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.846445 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.846731 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.846787 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.846990 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:45 crc kubenswrapper[4717]: E0217 14:52:45.847180 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.879160 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release
\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.900504 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmr
wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.926310 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.927780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.927841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.927857 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.927876 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.927889 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:45Z","lastTransitionTime":"2026-02-17T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.948609 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\"
:\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.968757 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443
a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:45 crc kubenswrapper[4717]: I0217 14:52:45.983227 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.000866 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.016200 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.030839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.030891 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.030902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.030924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.030938 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:46Z","lastTransitionTime":"2026-02-17T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.036505 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.052700 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.065351 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.073461 4717 generic.go:334] "Generic (PLEG): container finished" podID="db1be0ab-a28a-4d7a-b871-d9fc8dea5841" containerID="cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1" exitCode=0 Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.073502 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" event={"ID":"db1be0ab-a28a-4d7a-b871-d9fc8dea5841","Type":"ContainerDied","Data":"cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1"} Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.091288 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14
:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.110036 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.126709 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.133649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.133689 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.133703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.133723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.133736 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:46Z","lastTransitionTime":"2026-02-17T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.140736 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443
a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.156613 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.175208 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.190747 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.209452 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.222972 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.236206 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.236251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.236262 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.236281 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.236294 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:46Z","lastTransitionTime":"2026-02-17T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.239655 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.252674 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.267107 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.281055 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.294337 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.339604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.339738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.339770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.339814 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.339839 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:46Z","lastTransitionTime":"2026-02-17T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.443773 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.443839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.443859 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.443888 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.444021 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:46Z","lastTransitionTime":"2026-02-17T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.547558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.547609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.547623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.547645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.547659 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:46Z","lastTransitionTime":"2026-02-17T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.651366 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.651474 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.651496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.651524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.651541 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:46Z","lastTransitionTime":"2026-02-17T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.732810 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.755632 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 
14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.755973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.756006 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.756023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.756048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.756065 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:46Z","lastTransitionTime":"2026-02-17T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.769266 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.821465 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 13:08:28.574279324 +0000 UTC Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.858951 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.859019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.859037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.859062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.859101 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:46Z","lastTransitionTime":"2026-02-17T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.962251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.962530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.962543 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.962562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:46 crc kubenswrapper[4717]: I0217 14:52:46.962574 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:46Z","lastTransitionTime":"2026-02-17T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.064727 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.064770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.064786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.064808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.064824 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:47Z","lastTransitionTime":"2026-02-17T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.083544 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb"} Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.084128 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.084232 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.089058 4717 generic.go:334] "Generic (PLEG): container finished" podID="db1be0ab-a28a-4d7a-b871-d9fc8dea5841" containerID="4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20" exitCode=0 Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.089146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" event={"ID":"db1be0ab-a28a-4d7a-b871-d9fc8dea5841","Type":"ContainerDied","Data":"4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20"} Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.099371 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.113215 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.115311 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.118934 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.132961 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443
a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.146946 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.161811 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.168007 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.168059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.168073 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.168115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.168129 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:47Z","lastTransitionTime":"2026-02-17T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.174251 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.187455 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.197689 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.209938 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.224632 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.240010 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.255612 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.268680 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.270784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.270838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.270851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.270871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.271306 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:47Z","lastTransitionTime":"2026-02-17T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.287860 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.307554 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.322883 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.341034 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.357768 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.374224 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.374276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.374290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.374308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.374322 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:47Z","lastTransitionTime":"2026-02-17T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.384526 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.398148 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.412021 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.433608 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.456103 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.472012 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.515610 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.518359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.518419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.518436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.518460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.518478 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:47Z","lastTransitionTime":"2026-02-17T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.533152 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.546040 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.560662 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.621258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.621572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.621684 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.621760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.621828 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:47Z","lastTransitionTime":"2026-02-17T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.725182 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.725236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.725248 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.725270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.725282 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:47Z","lastTransitionTime":"2026-02-17T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.822338 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 22:55:45.403916375 +0000 UTC Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.827653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.827733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.827757 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.827790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.827819 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:47Z","lastTransitionTime":"2026-02-17T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.846319 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.846424 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:47 crc kubenswrapper[4717]: E0217 14:52:47.846467 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.846321 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:47 crc kubenswrapper[4717]: E0217 14:52:47.846587 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:47 crc kubenswrapper[4717]: E0217 14:52:47.846752 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.930326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.930417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.930443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.930466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:47 crc kubenswrapper[4717]: I0217 14:52:47.930482 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:47Z","lastTransitionTime":"2026-02-17T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.032616 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.032674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.032693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.032715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.032732 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:48Z","lastTransitionTime":"2026-02-17T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.097587 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" event={"ID":"db1be0ab-a28a-4d7a-b871-d9fc8dea5841","Type":"ContainerStarted","Data":"2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9"} Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.097658 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.114551 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.134478 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3d
a60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.134964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.134990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.134998 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.135012 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.135022 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:48Z","lastTransitionTime":"2026-02-17T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.151297 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.167720 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountP
ath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.188753 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.204129 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.223846 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.239230 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.239276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.239286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.239303 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.239315 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:48Z","lastTransitionTime":"2026-02-17T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.244836 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.263155 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.280836 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.297575 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.321676 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.339674 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.342680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.342726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.342740 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.342775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.342789 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:48Z","lastTransitionTime":"2026-02-17T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.352539 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.445557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.445634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.445651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.445672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.445685 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:48Z","lastTransitionTime":"2026-02-17T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.558610 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.558677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.558692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.558717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.558733 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:48Z","lastTransitionTime":"2026-02-17T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.661309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.661357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.661366 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.661384 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.661394 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:48Z","lastTransitionTime":"2026-02-17T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.763593 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.763651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.763659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.763674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.763682 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:48Z","lastTransitionTime":"2026-02-17T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.822635 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 06:58:00.859479983 +0000 UTC Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.866400 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.866451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.866468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.866490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.866507 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:48Z","lastTransitionTime":"2026-02-17T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.968495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.968529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.968538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.968552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:48 crc kubenswrapper[4717]: I0217 14:52:48.968561 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:48Z","lastTransitionTime":"2026-02-17T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.071712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.071772 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.071785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.071805 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.071818 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:49Z","lastTransitionTime":"2026-02-17T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.102104 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.175552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.175634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.175652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.175681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.175728 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:49Z","lastTransitionTime":"2026-02-17T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.279164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.279236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.279261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.279292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.279317 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:49Z","lastTransitionTime":"2026-02-17T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.385452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.385527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.385552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.385627 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.385648 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:49Z","lastTransitionTime":"2026-02-17T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.488462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.488820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.488965 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.489058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.489168 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:49Z","lastTransitionTime":"2026-02-17T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.592532 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.592596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.592608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.592635 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.592647 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:49Z","lastTransitionTime":"2026-02-17T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.694895 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.694964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.694985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.695008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.695021 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:49Z","lastTransitionTime":"2026-02-17T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.798427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.798501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.798529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.798562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.798586 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:49Z","lastTransitionTime":"2026-02-17T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.823519 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 04:06:20.858786699 +0000 UTC Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.846032 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.846173 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.846076 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:49 crc kubenswrapper[4717]: E0217 14:52:49.846280 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:49 crc kubenswrapper[4717]: E0217 14:52:49.846489 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:49 crc kubenswrapper[4717]: E0217 14:52:49.846624 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.901151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.901189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.901203 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.901220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:49 crc kubenswrapper[4717]: I0217 14:52:49.901233 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:49Z","lastTransitionTime":"2026-02-17T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.003828 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.003909 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.003934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.003967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.003991 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:50Z","lastTransitionTime":"2026-02-17T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.106313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.106341 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.106349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.106363 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.106372 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:50Z","lastTransitionTime":"2026-02-17T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.209051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.209149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.209169 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.209190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.209208 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:50Z","lastTransitionTime":"2026-02-17T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.311415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.311467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.311482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.311514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.311529 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:50Z","lastTransitionTime":"2026-02-17T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.414368 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.414424 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.414440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.414461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.414479 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:50Z","lastTransitionTime":"2026-02-17T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.517916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.517960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.517970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.517987 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.518000 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:50Z","lastTransitionTime":"2026-02-17T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.621454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.621508 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.621518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.621536 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.621549 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:50Z","lastTransitionTime":"2026-02-17T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.724789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.724847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.724865 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.724889 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.724907 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:50Z","lastTransitionTime":"2026-02-17T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.824328 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 08:16:24.094356577 +0000 UTC Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.828000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.828048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.828072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.828137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.828161 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:50Z","lastTransitionTime":"2026-02-17T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.931601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.931658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.931673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.931696 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:50 crc kubenswrapper[4717]: I0217 14:52:50.931711 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:50Z","lastTransitionTime":"2026-02-17T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.035221 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.035266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.035277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.035294 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.035306 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:51Z","lastTransitionTime":"2026-02-17T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.112434 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/0.log" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.116033 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb" exitCode=1 Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.116129 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb"} Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.117697 4717 scope.go:117] "RemoveContainer" containerID="abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.138603 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.138674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.138686 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.138707 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.138718 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:51Z","lastTransitionTime":"2026-02-17T14:52:51Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.140120 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256
:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.156502 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.174388 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.190787 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.213703 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:50Z\\\",\\\"message\\\":\\\"\\\\nI0217 14:52:50.236856 5999 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:52:50.238576 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 14:52:50.238596 5999 handler.go:208] Removed 
*v1.Node event handler 2\\\\nI0217 14:52:50.239949 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 14:52:50.239988 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:52:50.239993 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:52:50.240014 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:52:50.240034 5999 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:52:50.240038 5999 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:52:50.240083 5999 factory.go:656] Stopping watch factory\\\\nI0217 14:52:50.240120 5999 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:52:50.240153 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 14:52:50.240165 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:52:50.240171 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:52:50.240178 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:52:50.240184 5999 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce7
2976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.229787 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.241938 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.242026 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.242053 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.242123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.242152 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:51Z","lastTransitionTime":"2026-02-17T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.250619 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.268023 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443
a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.286421 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.304054 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc20
93476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:
52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.322862 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.336361 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.345146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.345199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.345215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.345242 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.345259 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:51Z","lastTransitionTime":"2026-02-17T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.352506 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.374503 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.447990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.448110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.448134 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.448162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.448187 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:51Z","lastTransitionTime":"2026-02-17T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.552502 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.552576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.552596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.552625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.552645 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:51Z","lastTransitionTime":"2026-02-17T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.655811 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.655889 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.655913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.655949 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.655977 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:51Z","lastTransitionTime":"2026-02-17T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.758622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.758675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.758687 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.758709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.758721 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:51Z","lastTransitionTime":"2026-02-17T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.824886 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 20:02:45.91932306 +0000 UTC Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.846434 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:51 crc kubenswrapper[4717]: E0217 14:52:51.846588 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.846441 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.846683 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:51 crc kubenswrapper[4717]: E0217 14:52:51.846771 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:51 crc kubenswrapper[4717]: E0217 14:52:51.846828 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.861870 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.861908 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.861923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.861943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.861953 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:51Z","lastTransitionTime":"2026-02-17T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.964971 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.965013 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.965022 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.965039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:51 crc kubenswrapper[4717]: I0217 14:52:51.965048 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:51Z","lastTransitionTime":"2026-02-17T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.068150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.068184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.068194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.068207 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.068218 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:52Z","lastTransitionTime":"2026-02-17T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.126440 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/0.log" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.130163 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d"} Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.130380 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.149282 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276
703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.170741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.170791 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.170804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.170824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.170837 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:52Z","lastTransitionTime":"2026-02-17T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.187034 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:50Z\\\",\\\"message\\\":\\\"\\\\nI0217 14:52:50.236856 5999 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:52:50.238576 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 14:52:50.238596 5999 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:52:50.239949 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0217 14:52:50.239988 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:52:50.239993 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:52:50.240014 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:52:50.240034 5999 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:52:50.240038 5999 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:52:50.240083 5999 factory.go:656] Stopping watch factory\\\\nI0217 14:52:50.240120 5999 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:52:50.240153 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 14:52:50.240165 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:52:50.240171 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:52:50.240178 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:52:50.240184 5999 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.204160 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.221825 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.236988 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.250948 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.264342 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.273169 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.273215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.273224 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.273242 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.273254 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:52Z","lastTransitionTime":"2026-02-17T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.277522 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.288204 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.298907 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.312657 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.339989 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.351984 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.363650 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.375723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.375778 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.375790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.375809 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.375820 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:52Z","lastTransitionTime":"2026-02-17T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.477564 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf"] Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.478043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.478085 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.478134 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.478152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.478164 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:52Z","lastTransitionTime":"2026-02-17T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.478448 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.481240 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.481485 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.500677 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.515898 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.529230 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/757e9eb2-a664-4fca-b745-5c3152a4c613-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.529311 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/757e9eb2-a664-4fca-b745-5c3152a4c613-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.529419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7sx8\" (UniqueName: \"kubernetes.io/projected/757e9eb2-a664-4fca-b745-5c3152a4c613-kube-api-access-j7sx8\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.529504 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/757e9eb2-a664-4fca-b745-5c3152a4c613-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.531021 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.545740 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.570108 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:50Z\\\",\\\"message\\\":\\\"\\\\nI0217 14:52:50.236856 5999 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:52:50.238576 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 14:52:50.238596 5999 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:52:50.239949 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0217 14:52:50.239988 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:52:50.239993 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:52:50.240014 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:52:50.240034 5999 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:52:50.240038 5999 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:52:50.240083 5999 factory.go:656] Stopping watch factory\\\\nI0217 14:52:50.240120 5999 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:52:50.240153 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 14:52:50.240165 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:52:50.240171 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:52:50.240178 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:52:50.240184 5999 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.581135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.581170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.581179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.581207 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.581218 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:52Z","lastTransitionTime":"2026-02-17T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.588172 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.608486 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.624520 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.630897 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/757e9eb2-a664-4fca-b745-5c3152a4c613-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.630991 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/757e9eb2-a664-4fca-b745-5c3152a4c613-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.631033 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7sx8\" (UniqueName: \"kubernetes.io/projected/757e9eb2-a664-4fca-b745-5c3152a4c613-kube-api-access-j7sx8\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.631073 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/757e9eb2-a664-4fca-b745-5c3152a4c613-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.631872 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/757e9eb2-a664-4fca-b745-5c3152a4c613-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.632020 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/757e9eb2-a664-4fca-b745-5c3152a4c613-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.639533 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/757e9eb2-a664-4fca-b745-5c3152a4c613-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.641698 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.652428 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7sx8\" (UniqueName: 
\"kubernetes.io/projected/757e9eb2-a664-4fca-b745-5c3152a4c613-kube-api-access-j7sx8\") pod \"ovnkube-control-plane-749d76644c-4hbqf\" (UID: \"757e9eb2-a664-4fca-b745-5c3152a4c613\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.662528 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfb
b085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":
\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.678868 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.683850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.683888 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.683898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 
14:52:52.683914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.683927 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:52Z","lastTransitionTime":"2026-02-17T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.690029 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.702914 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443
a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.713024 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.728576 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.787896 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.788107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.788147 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.788184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.788201 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:52Z","lastTransitionTime":"2026-02-17T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.798524 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" Feb 17 14:52:52 crc kubenswrapper[4717]: W0217 14:52:52.821453 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757e9eb2_a664_4fca_b745_5c3152a4c613.slice/crio-d19f0cdc705582b54cbf72faf99ddcb9989f0fdb8d4cd7a2f5067383f7e79850 WatchSource:0}: Error finding container d19f0cdc705582b54cbf72faf99ddcb9989f0fdb8d4cd7a2f5067383f7e79850: Status 404 returned error can't find the container with id d19f0cdc705582b54cbf72faf99ddcb9989f0fdb8d4cd7a2f5067383f7e79850 Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.825250 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 12:05:31.082841828 +0000 UTC Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.891046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.891119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.891132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.891156 4717 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.891167 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:52Z","lastTransitionTime":"2026-02-17T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.993299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.993347 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.993362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.993380 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:52 crc kubenswrapper[4717]: I0217 14:52:52.993392 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:52Z","lastTransitionTime":"2026-02-17T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.096831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.096872 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.096929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.096948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.096959 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:53Z","lastTransitionTime":"2026-02-17T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.141780 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" event={"ID":"757e9eb2-a664-4fca-b745-5c3152a4c613","Type":"ContainerStarted","Data":"1360f46d3ef4ba0e147e5a0f6df374b887bb88e34940c632365b2a69875e393f"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.141856 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" event={"ID":"757e9eb2-a664-4fca-b745-5c3152a4c613","Type":"ContainerStarted","Data":"0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.141873 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" event={"ID":"757e9eb2-a664-4fca-b745-5c3152a4c613","Type":"ContainerStarted","Data":"d19f0cdc705582b54cbf72faf99ddcb9989f0fdb8d4cd7a2f5067383f7e79850"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.144755 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/1.log" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.145728 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/0.log" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.148492 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d" exitCode=1 Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.148533 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" 
event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.148572 4717 scope.go:117] "RemoveContainer" containerID="abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.149418 4717 scope.go:117] "RemoveContainer" containerID="213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d" Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.149601 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.159131 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.172806 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.186968 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.198892 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.198918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.198926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.198940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.198948 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:53Z","lastTransitionTime":"2026-02-17T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.204376 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z 
is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.221731 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.237354 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.255298 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.275705 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.293081 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.301410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.301453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.301462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.301477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.301488 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:53Z","lastTransitionTime":"2026-02-17T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.307902 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.326675 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:50Z\\\",\\\"message\\\":\\\"\\\\nI0217 14:52:50.236856 5999 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:52:50.238576 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 14:52:50.238596 5999 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:52:50.239949 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0217 14:52:50.239988 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:52:50.239993 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:52:50.240014 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:52:50.240034 5999 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:52:50.240038 5999 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:52:50.240083 5999 factory.go:656] Stopping watch factory\\\\nI0217 14:52:50.240120 5999 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:52:50.240153 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 14:52:50.240165 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:52:50.240171 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:52:50.240178 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:52:50.240184 5999 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.338942 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.350114 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.360455 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.371301 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.381731 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.395338 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.403398 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.403439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.403450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.403467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.403478 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:53Z","lastTransitionTime":"2026-02-17T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.408286 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.419467 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.444225 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:50Z\\\",\\\"message\\\":\\\"\\\\nI0217 14:52:50.236856 5999 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:52:50.238576 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 14:52:50.238596 5999 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:52:50.239949 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0217 14:52:50.239988 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:52:50.239993 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:52:50.240014 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:52:50.240034 5999 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:52:50.240038 5999 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:52:50.240083 5999 factory.go:656] Stopping watch factory\\\\nI0217 14:52:50.240120 5999 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:52:50.240153 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 14:52:50.240165 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:52:50.240171 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:52:50.240178 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:52:50.240184 5999 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"message\\\":\\\"ind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nF0217 14:52:52.446157 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce729
76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.462673 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.478279 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.494749 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.506068 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.506179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.506208 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.506245 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.506270 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:53Z","lastTransitionTime":"2026-02-17T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.510797 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a601
57fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\
\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.532120 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\
\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.550623 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb08
5a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.563742 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.573526 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.585332 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.596312 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.609658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 
14:52:53.609702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.609724 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.609750 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.609764 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:53Z","lastTransitionTime":"2026-02-17T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.642522 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.642652 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:53:09.642627517 +0000 UTC m=+56.058467983 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.712024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.712064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.712075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.712109 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.712120 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:53Z","lastTransitionTime":"2026-02-17T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.743968 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.744011 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.744035 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.744056 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744139 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744180 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744194 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:53:09.74417628 +0000 UTC m=+56.160016756 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744229 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:53:09.744215711 +0000 UTC m=+56.160056187 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744311 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744345 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744341 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744394 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744409 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744468 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-17 14:53:09.744451268 +0000 UTC m=+56.160291744 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744363 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.744522 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:53:09.744516669 +0000 UTC m=+56.160357145 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.814601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.814638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.814647 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.814662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.814671 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:53Z","lastTransitionTime":"2026-02-17T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.826350 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 02:51:08.525568247 +0000 UTC Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.847275 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.847397 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.847279 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.847494 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.847682 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.847811 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.918012 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.918119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.918137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.918171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.918209 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:53Z","lastTransitionTime":"2026-02-17T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.961749 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-pzb78"] Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.962608 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:53 crc kubenswrapper[4717]: E0217 14:52:53.962729 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.975294 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:53 crc kubenswrapper[4717]: I0217 14:52:53.988568 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443
a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.001805 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.007966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc 
kubenswrapper[4717]: I0217 14:52:54.008023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.008034 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.008051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.008063 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.015162 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5056
53ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: E0217 14:52:54.019341 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",
\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.022288 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.022332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.022345 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.022364 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.022375 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.026600 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: E0217 14:52:54.034721 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.036512 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.039289 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.039330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.039339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.039354 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.039365 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.046929 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.047003 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65w24\" (UniqueName: \"kubernetes.io/projected/f31d30c1-1e4a-49d3-adef-767a88616f33-kube-api-access-65w24\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.050167 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: E0217 14:52:54.052045 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.056466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.056524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.056537 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.056567 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.056580 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: E0217 14:52:54.069525 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.071527 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:50Z\\\",\\\"message\\\":\\\"\\\\nI0217 14:52:50.236856 5999 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:52:50.238576 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 14:52:50.238596 5999 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:52:50.239949 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0217 14:52:50.239988 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:52:50.239993 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:52:50.240014 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:52:50.240034 5999 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:52:50.240038 5999 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:52:50.240083 5999 factory.go:656] Stopping watch factory\\\\nI0217 14:52:50.240120 5999 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:52:50.240153 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 14:52:50.240165 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:52:50.240171 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:52:50.240178 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:52:50.240184 5999 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"message\\\":\\\"ind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nF0217 14:52:52.446157 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce729
76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.073769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.073803 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.073817 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.073836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.073847 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.084272 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc 
kubenswrapper[4717]: E0217 14:52:54.085228 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: E0217 14:52:54.085396 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.087598 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.087706 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.087732 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.087769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.087796 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.102448 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b2136
41fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.117141 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3d
a60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.131113 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.144754 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.147516 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65w24\" (UniqueName: \"kubernetes.io/projected/f31d30c1-1e4a-49d3-adef-767a88616f33-kube-api-access-65w24\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.147600 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:54 crc kubenswrapper[4717]: E0217 14:52:54.147700 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:52:54 crc kubenswrapper[4717]: E0217 14:52:54.147766 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs podName:f31d30c1-1e4a-49d3-adef-767a88616f33 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:54.647748454 +0000 UTC m=+41.063588950 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs") pod "network-metrics-daemon-pzb78" (UID: "f31d30c1-1e4a-49d3-adef-767a88616f33") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.159022 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/1.log" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.159585 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52
:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.172534 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.174029 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65w24\" (UniqueName: \"kubernetes.io/projected/f31d30c1-1e4a-49d3-adef-767a88616f33-kube-api-access-65w24\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.187596 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.190266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.190312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.190323 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.190345 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.190355 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.293658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.293707 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.293717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.293736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.293756 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.396590 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.396619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.396627 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.396645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.396654 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.499769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.499824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.499841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.499864 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.499881 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.602706 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.602827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.602871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.602897 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.602913 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.653317 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:54 crc kubenswrapper[4717]: E0217 14:52:54.653509 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:52:54 crc kubenswrapper[4717]: E0217 14:52:54.653582 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs podName:f31d30c1-1e4a-49d3-adef-767a88616f33 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:55.653559889 +0000 UTC m=+42.069400385 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs") pod "network-metrics-daemon-pzb78" (UID: "f31d30c1-1e4a-49d3-adef-767a88616f33") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.706395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.706468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.706485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.706509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.706525 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.808923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.808961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.808970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.808986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.808996 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.827454 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 14:12:48.543890894 +0000 UTC Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.847378 4717 scope.go:117] "RemoveContainer" containerID="2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.911991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.912027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.912041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.912060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:54 crc kubenswrapper[4717]: I0217 14:52:54.912073 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:54Z","lastTransitionTime":"2026-02-17T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.014654 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.014692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.014700 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.014714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.014722 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:55Z","lastTransitionTime":"2026-02-17T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.117289 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.117331 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.117346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.117365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.117381 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:55Z","lastTransitionTime":"2026-02-17T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.172312 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.174982 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.175619 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.188420 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f3
6cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.201822 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.222348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.222434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.222456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.222484 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.222504 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:55Z","lastTransitionTime":"2026-02-17T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.223237 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb88e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.245294 4717 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.258048 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.269953 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.282598 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.296895 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.310246 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.324626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.324666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.324674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.324688 4717 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.324697 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:55Z","lastTransitionTime":"2026-02-17T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.325664 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.342655 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.368000 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:50Z\\\",\\\"message\\\":\\\"\\\\nI0217 14:52:50.236856 5999 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:52:50.238576 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 14:52:50.238596 5999 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:52:50.239949 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0217 14:52:50.239988 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:52:50.239993 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:52:50.240014 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:52:50.240034 5999 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:52:50.240038 5999 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:52:50.240083 5999 factory.go:656] Stopping watch factory\\\\nI0217 14:52:50.240120 5999 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:52:50.240153 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 14:52:50.240165 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:52:50.240171 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:52:50.240178 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:52:50.240184 5999 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"message\\\":\\\"ind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nF0217 14:52:52.446157 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce729
76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.381855 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.394531 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.413005 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.427627 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.427945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.427958 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.427974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.427984 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:55Z","lastTransitionTime":"2026-02-17T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.430032 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.530637 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.530696 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.530706 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.530725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.530736 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:55Z","lastTransitionTime":"2026-02-17T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.633937 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.634000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.634024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.634057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.634112 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:55Z","lastTransitionTime":"2026-02-17T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.665841 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:55 crc kubenswrapper[4717]: E0217 14:52:55.666046 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:52:55 crc kubenswrapper[4717]: E0217 14:52:55.666136 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs podName:f31d30c1-1e4a-49d3-adef-767a88616f33 nodeName:}" failed. No retries permitted until 2026-02-17 14:52:57.666112585 +0000 UTC m=+44.081953071 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs") pod "network-metrics-daemon-pzb78" (UID: "f31d30c1-1e4a-49d3-adef-767a88616f33") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.736604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.736643 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.736652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.736667 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.736678 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:55Z","lastTransitionTime":"2026-02-17T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.827646 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 03:46:04.620879996 +0000 UTC Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.840268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.840320 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.840331 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.840348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.840358 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:55Z","lastTransitionTime":"2026-02-17T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.845924 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.846000 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.846023 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:55 crc kubenswrapper[4717]: E0217 14:52:55.846221 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.846542 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:55 crc kubenswrapper[4717]: E0217 14:52:55.846666 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:55 crc kubenswrapper[4717]: E0217 14:52:55.846829 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:55 crc kubenswrapper[4717]: E0217 14:52:55.847039 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.862476 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"ter
minated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf7
3e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.876152 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.887691 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.899738 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.916381 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.932125 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.942539 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.942588 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.942600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.942620 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.942634 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:55Z","lastTransitionTime":"2026-02-17T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.947039 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:55 crc kubenswrapper[4717]: I0217 14:52:55.986394 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.014485 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.034272 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.044404 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.044468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.044481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.044499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.044511 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:56Z","lastTransitionTime":"2026-02-17T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.047232 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.064890 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abf2b8588ab40651e9668c3c660f37abd6646a37ba15b47adb12e82aae0f1bfb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:50Z\\\",\\\"message\\\":\\\"\\\\nI0217 14:52:50.236856 5999 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:52:50.238576 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 14:52:50.238596 5999 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:52:50.239949 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0217 14:52:50.239988 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:52:50.239993 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:52:50.240014 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:52:50.240034 5999 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:52:50.240038 5999 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:52:50.240083 5999 factory.go:656] Stopping watch factory\\\\nI0217 14:52:50.240120 5999 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:52:50.240153 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 14:52:50.240165 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:52:50.240171 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:52:50.240178 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:52:50.240184 5999 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"message\\\":\\\"ind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nF0217 14:52:52.446157 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce729
76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.075137 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.094317 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.107299 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.118255 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:52:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.149883 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.149931 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.149940 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.149965 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.149975 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:56Z","lastTransitionTime":"2026-02-17T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.252255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.252333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.252362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.252399 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.252424 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:56Z","lastTransitionTime":"2026-02-17T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.355622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.355680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.355697 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.355723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.355741 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:56Z","lastTransitionTime":"2026-02-17T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.459308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.459374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.459392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.459420 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.459442 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:56Z","lastTransitionTime":"2026-02-17T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.563230 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.563323 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.563346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.563378 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.563409 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:56Z","lastTransitionTime":"2026-02-17T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.667246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.667310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.667329 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.667355 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.667374 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:56Z","lastTransitionTime":"2026-02-17T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.770325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.770446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.770474 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.770503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.770526 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:56Z","lastTransitionTime":"2026-02-17T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.828728 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 19:54:47.643827403 +0000 UTC Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.873597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.873656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.873668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.873686 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.873698 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:56Z","lastTransitionTime":"2026-02-17T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.976436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.976499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.976509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.976529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:56 crc kubenswrapper[4717]: I0217 14:52:56.976555 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:56Z","lastTransitionTime":"2026-02-17T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.079005 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.079058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.079067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.079098 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.079110 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:57Z","lastTransitionTime":"2026-02-17T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.181793 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.181845 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.181858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.181878 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.181903 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:57Z","lastTransitionTime":"2026-02-17T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.285243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.285301 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.285314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.285336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.285353 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:57Z","lastTransitionTime":"2026-02-17T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.389029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.389167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.389189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.389218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.389237 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:57Z","lastTransitionTime":"2026-02-17T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.492039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.492132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.492152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.492181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.492202 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:57Z","lastTransitionTime":"2026-02-17T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.595381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.595487 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.595509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.595541 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.595568 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:57Z","lastTransitionTime":"2026-02-17T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.691913 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:57 crc kubenswrapper[4717]: E0217 14:52:57.692115 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:52:57 crc kubenswrapper[4717]: E0217 14:52:57.692186 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs podName:f31d30c1-1e4a-49d3-adef-767a88616f33 nodeName:}" failed. No retries permitted until 2026-02-17 14:53:01.692170733 +0000 UTC m=+48.108011209 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs") pod "network-metrics-daemon-pzb78" (UID: "f31d30c1-1e4a-49d3-adef-767a88616f33") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.698675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.698767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.698789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.698820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.698845 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:57Z","lastTransitionTime":"2026-02-17T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.805568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.805643 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.805661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.805690 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.805722 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:57Z","lastTransitionTime":"2026-02-17T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.829248 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 17:50:46.680355872 +0000 UTC Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.845956 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.846070 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.846177 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.846427 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:57 crc kubenswrapper[4717]: E0217 14:52:57.846411 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:52:57 crc kubenswrapper[4717]: E0217 14:52:57.846622 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:57 crc kubenswrapper[4717]: E0217 14:52:57.846848 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:57 crc kubenswrapper[4717]: E0217 14:52:57.847150 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.908440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.908497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.908514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.908546 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:57 crc kubenswrapper[4717]: I0217 14:52:57.908565 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:57Z","lastTransitionTime":"2026-02-17T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.011513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.011559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.011575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.011596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.011611 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:58Z","lastTransitionTime":"2026-02-17T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.114806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.114884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.114901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.114929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.114951 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:58Z","lastTransitionTime":"2026-02-17T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.219046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.219162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.219187 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.219217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.219243 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:58Z","lastTransitionTime":"2026-02-17T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.322602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.322643 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.322653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.322674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.322687 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:58Z","lastTransitionTime":"2026-02-17T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.425789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.425859 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.425873 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.425889 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.425900 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:58Z","lastTransitionTime":"2026-02-17T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.528911 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.528948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.528957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.528974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.528984 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:58Z","lastTransitionTime":"2026-02-17T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.631789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.631846 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.631865 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.631885 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.631901 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:58Z","lastTransitionTime":"2026-02-17T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.735054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.735135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.735148 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.735170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.735187 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:58Z","lastTransitionTime":"2026-02-17T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.829377 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 09:20:37.670485108 +0000 UTC Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.837172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.837244 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.837287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.837310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.837323 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:58Z","lastTransitionTime":"2026-02-17T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.940634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.940699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.940715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.940744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:58 crc kubenswrapper[4717]: I0217 14:52:58.940765 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:58Z","lastTransitionTime":"2026-02-17T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.043629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.043686 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.043703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.043727 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.043743 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:59Z","lastTransitionTime":"2026-02-17T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.147472 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.147523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.147540 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.147566 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.147588 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:59Z","lastTransitionTime":"2026-02-17T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.251230 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.251290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.251304 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.251325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.251343 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:59Z","lastTransitionTime":"2026-02-17T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.355428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.355479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.355492 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.355514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.355526 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:59Z","lastTransitionTime":"2026-02-17T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.459777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.460336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.460523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.460685 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.460820 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:59Z","lastTransitionTime":"2026-02-17T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.564496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.564552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.564568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.564590 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.564606 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:59Z","lastTransitionTime":"2026-02-17T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.668485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.668560 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.668585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.668619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.668648 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:59Z","lastTransitionTime":"2026-02-17T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.773075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.773192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.773215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.773253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.773276 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:59Z","lastTransitionTime":"2026-02-17T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.829559 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 22:50:58.099060433 +0000 UTC Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.846169 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.846037 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.846190 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:52:59 crc kubenswrapper[4717]: E0217 14:52:59.846399 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.846735 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:52:59 crc kubenswrapper[4717]: E0217 14:52:59.847034 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:52:59 crc kubenswrapper[4717]: E0217 14:52:59.847334 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:52:59 crc kubenswrapper[4717]: E0217 14:52:59.847497 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.876418 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.876481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.876499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.876525 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.876544 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:59Z","lastTransitionTime":"2026-02-17T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.980448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.980513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.980531 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.980557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:52:59 crc kubenswrapper[4717]: I0217 14:52:59.980575 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:52:59Z","lastTransitionTime":"2026-02-17T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.084150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.084213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.084232 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.084260 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.084278 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:00Z","lastTransitionTime":"2026-02-17T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.187812 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.187887 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.187908 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.187944 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.187969 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:00Z","lastTransitionTime":"2026-02-17T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.291200 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.291275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.291299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.291335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.291358 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:00Z","lastTransitionTime":"2026-02-17T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.394716 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.394807 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.394827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.394856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.394874 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:00Z","lastTransitionTime":"2026-02-17T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.498668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.498763 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.498788 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.498853 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.498875 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:00Z","lastTransitionTime":"2026-02-17T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.602725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.602911 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.602943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.603545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.603731 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:00Z","lastTransitionTime":"2026-02-17T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.707132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.707215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.707241 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.707274 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.707306 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:00Z","lastTransitionTime":"2026-02-17T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.811004 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.811127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.811155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.811188 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.811210 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:00Z","lastTransitionTime":"2026-02-17T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.830521 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:12:31.211288961 +0000 UTC Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.914307 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.914904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.915072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.915299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:00 crc kubenswrapper[4717]: I0217 14:53:00.915702 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:00Z","lastTransitionTime":"2026-02-17T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.019480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.019552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.019570 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.019595 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.019620 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:01Z","lastTransitionTime":"2026-02-17T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.122228 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.122284 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.122296 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.122315 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.122330 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:01Z","lastTransitionTime":"2026-02-17T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.226074 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.226192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.226381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.226421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.226449 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:01Z","lastTransitionTime":"2026-02-17T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.329853 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.330423 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.330597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.330749 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.330904 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:01Z","lastTransitionTime":"2026-02-17T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.434517 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.434560 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.434569 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.434586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.434596 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:01Z","lastTransitionTime":"2026-02-17T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.538110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.538177 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.538190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.538215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.538235 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:01Z","lastTransitionTime":"2026-02-17T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.641581 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.641662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.641681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.641715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.641740 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:01Z","lastTransitionTime":"2026-02-17T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.741000 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:01 crc kubenswrapper[4717]: E0217 14:53:01.741290 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:53:01 crc kubenswrapper[4717]: E0217 14:53:01.741460 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs podName:f31d30c1-1e4a-49d3-adef-767a88616f33 nodeName:}" failed. No retries permitted until 2026-02-17 14:53:09.741421661 +0000 UTC m=+56.157262257 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs") pod "network-metrics-daemon-pzb78" (UID: "f31d30c1-1e4a-49d3-adef-767a88616f33") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.746167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.746252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.746280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.746314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.746334 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:01Z","lastTransitionTime":"2026-02-17T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.831706 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 19:19:14.194863884 +0000 UTC Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.847896 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.847984 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:01 crc kubenswrapper[4717]: E0217 14:53:01.848172 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.847910 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:01 crc kubenswrapper[4717]: E0217 14:53:01.848610 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:01 crc kubenswrapper[4717]: E0217 14:53:01.848932 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.849030 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:01 crc kubenswrapper[4717]: E0217 14:53:01.849332 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.851953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.852005 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.852029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.852064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.852079 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:01Z","lastTransitionTime":"2026-02-17T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.955581 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.955627 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.955640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.955659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:01 crc kubenswrapper[4717]: I0217 14:53:01.955680 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:01Z","lastTransitionTime":"2026-02-17T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.058920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.059012 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.059035 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.059056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.059071 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:02Z","lastTransitionTime":"2026-02-17T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.162530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.162599 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.162623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.162655 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.162678 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:02Z","lastTransitionTime":"2026-02-17T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.265058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.265131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.265192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.265217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.265231 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:02Z","lastTransitionTime":"2026-02-17T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.367471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.367521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.367540 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.367565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.367587 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:02Z","lastTransitionTime":"2026-02-17T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.471125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.471200 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.471224 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.471257 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.471281 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:02Z","lastTransitionTime":"2026-02-17T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.573984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.574030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.574043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.574057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.574066 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:02Z","lastTransitionTime":"2026-02-17T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.677214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.677330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.677350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.677416 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.677434 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:02Z","lastTransitionTime":"2026-02-17T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.781026 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.781116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.781141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.781170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.781190 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:02Z","lastTransitionTime":"2026-02-17T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.832716 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 15:42:59.869533141 +0000 UTC Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.884521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.884585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.884606 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.884630 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.884650 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:02Z","lastTransitionTime":"2026-02-17T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.988290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.988359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.988381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.988409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:02 crc kubenswrapper[4717]: I0217 14:53:02.988431 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:02Z","lastTransitionTime":"2026-02-17T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.091983 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.092119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.092141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.092169 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.092186 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:03Z","lastTransitionTime":"2026-02-17T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.194957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.195029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.195048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.195129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.195160 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:03Z","lastTransitionTime":"2026-02-17T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.298324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.298399 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.298420 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.298452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.298474 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:03Z","lastTransitionTime":"2026-02-17T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.401538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.401570 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.401578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.401591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.401600 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:03Z","lastTransitionTime":"2026-02-17T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.505130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.505197 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.505217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.505243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.505261 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:03Z","lastTransitionTime":"2026-02-17T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.608550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.608613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.608632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.608661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.608681 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:03Z","lastTransitionTime":"2026-02-17T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.711812 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.711896 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.711920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.711953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.711976 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:03Z","lastTransitionTime":"2026-02-17T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.814743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.814807 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.814825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.814854 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.814873 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:03Z","lastTransitionTime":"2026-02-17T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.833394 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 22:41:50.837435084 +0000 UTC Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.845909 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.845951 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.845969 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.846072 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:03 crc kubenswrapper[4717]: E0217 14:53:03.846410 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:03 crc kubenswrapper[4717]: E0217 14:53:03.846547 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:03 crc kubenswrapper[4717]: E0217 14:53:03.846708 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:03 crc kubenswrapper[4717]: E0217 14:53:03.846802 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.918511 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.918577 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.918595 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.918625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:03 crc kubenswrapper[4717]: I0217 14:53:03.918644 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:03Z","lastTransitionTime":"2026-02-17T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.021189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.021236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.021249 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.021267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.021278 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.124435 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.124486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.124495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.124512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.124522 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.227243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.227296 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.227308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.227330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.227345 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.330738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.330808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.330823 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.330841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.330852 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.389955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.390042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.390064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.390131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.390244 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: E0217 14:53:04.411627 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.417268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.417346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.417365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.417392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.417411 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: E0217 14:53:04.440550 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.445433 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.445476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.445488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.445508 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.445523 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: E0217 14:53:04.465763 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.471429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.471471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.471483 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.471501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.471516 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: E0217 14:53:04.491540 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.496512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.496569 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.496586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.496610 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.496628 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: E0217 14:53:04.518794 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:04 crc kubenswrapper[4717]: E0217 14:53:04.519021 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.521507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.521557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.521577 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.521603 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.521621 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.625542 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.625592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.625606 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.625624 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.625636 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.729233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.729350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.729411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.729444 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.729464 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.833386 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.833460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.833483 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.833510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.833529 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.833595 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 07:59:13.256807265 +0000 UTC Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.936419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.936477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.936495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.936517 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:04 crc kubenswrapper[4717]: I0217 14:53:04.936535 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:04Z","lastTransitionTime":"2026-02-17T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.039555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.039622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.039640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.039663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.039680 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:05Z","lastTransitionTime":"2026-02-17T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.142554 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.142632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.142653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.142681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.142699 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:05Z","lastTransitionTime":"2026-02-17T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.213217 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.214604 4717 scope.go:117] "RemoveContainer" containerID="213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.242563 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.245941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.245996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.246020 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.246056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.246114 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:05Z","lastTransitionTime":"2026-02-17T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.268158 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.311373 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"message\\\":\\\"ind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", 
Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nF0217 14:52:52.446157 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a24
5046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.326257 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.344557 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.350795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.350859 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.350882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 
14:53:05.350910 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.350929 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:05Z","lastTransitionTime":"2026-02-17T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.371872 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.389446 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb88e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.403531 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\
\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.417303 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.434705 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.448314 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.453757 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.453808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.453826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.453852 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.453871 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:05Z","lastTransitionTime":"2026-02-17T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.465960 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.488568 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.509324 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc20
93476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:
52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.527930 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.546185 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.556730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.556787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.556806 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.556833 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.556850 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:05Z","lastTransitionTime":"2026-02-17T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.659253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.659312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.659326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.659351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.659368 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:05Z","lastTransitionTime":"2026-02-17T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.761926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.761980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.761990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.762010 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.762021 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:05Z","lastTransitionTime":"2026-02-17T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.834236 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 12:41:27.893781436 +0000 UTC Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.849351 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:05 crc kubenswrapper[4717]: E0217 14:53:05.849506 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.849606 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:05 crc kubenswrapper[4717]: E0217 14:53:05.849673 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.849735 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:05 crc kubenswrapper[4717]: E0217 14:53:05.849816 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.849892 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:05 crc kubenswrapper[4717]: E0217 14:53:05.849973 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.864204 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.864679 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.864726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.864738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.864760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.864775 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:05Z","lastTransitionTime":"2026-02-17T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.880761 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.901719 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:05 crc kubenswrapper[4717]: I0217 14:53:05.945852 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:53:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.007025 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.007059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.007067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.007107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.007118 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:06Z","lastTransitionTime":"2026-02-17T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.010120 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"message\\\":\\\"ind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", 
Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nF0217 14:52:52.446157 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a24
5046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.030600 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.056777 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.072415 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.087253 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.105736 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5056
53ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.110266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.110298 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.110309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.110325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.110337 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:06Z","lastTransitionTime":"2026-02-17T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.119868 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.136307 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.148908 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.161278 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.171285 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.184462 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.217031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.217099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.217110 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.217129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.217143 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:06Z","lastTransitionTime":"2026-02-17T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.220948 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/1.log" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.224203 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322"} Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.224708 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.236829 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.249518 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.260820 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.273437 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.284193 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.295920 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.308652 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.320936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.320990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.321002 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.321024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.321038 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:06Z","lastTransitionTime":"2026-02-17T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.322526 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.339578 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.352050 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.367450 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.387910 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"message\\\":\\\"ind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", 
Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nF0217 14:52:52.446157 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:53:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.400225 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc 
kubenswrapper[4717]: I0217 14:53:06.413820 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.423855 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.423897 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.423908 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.423927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.423939 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:06Z","lastTransitionTime":"2026-02-17T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.431276 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.446833 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.527514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.527578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.527594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.527620 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.527638 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:06Z","lastTransitionTime":"2026-02-17T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.630689 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.631191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.631416 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.631609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.631796 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:06Z","lastTransitionTime":"2026-02-17T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.734664 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.734773 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.734796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.734826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.734848 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:06Z","lastTransitionTime":"2026-02-17T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.835036 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 09:00:06.940319664 +0000 UTC Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.837418 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.837478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.837497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.837520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.837538 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:06Z","lastTransitionTime":"2026-02-17T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.940462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.940575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.940602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.940637 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:06 crc kubenswrapper[4717]: I0217 14:53:06.940663 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:06Z","lastTransitionTime":"2026-02-17T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.042979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.043018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.043029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.043045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.043054 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:07Z","lastTransitionTime":"2026-02-17T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.146865 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.146935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.146956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.146983 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.147002 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:07Z","lastTransitionTime":"2026-02-17T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.231529 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/2.log" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.232601 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/1.log" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.236598 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322" exitCode=1 Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.236678 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322"} Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.236748 4717 scope.go:117] "RemoveContainer" containerID="213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.237722 4717 scope.go:117] "RemoveContainer" containerID="b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322" Feb 17 14:53:07 crc kubenswrapper[4717]: E0217 14:53:07.237957 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.249189 4717 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.249240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.249250 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.249273 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.249284 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:07Z","lastTransitionTime":"2026-02-17T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.258557 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.279262 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.301302 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.319478 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.344129 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://213b3000be60b58675df81c78b86b4f754f986acecbb9a96cf4fceef377cd82d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"message\\\":\\\"ind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", 
Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nF0217 14:52:52.446157 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:06Z\\\",\\\"message\\\":\\\"twork=default: []services.LB{}\\\\nI0217 14:53:06.414305 6389 services_controller.go:454] Service openshift-machine-api/machine-api-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 
14:53:06.413756 6389 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 14:53:06.414231 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:06.414283 6389 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI0217 14:53:06.413829 6389 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-xr7pz in node crc\\\\nI0217 14:53:06.414442 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-xr7pz after 0 failed attempt(s)\\\\nI0217 14:53:06.414448 6389 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-xr7pz\\\\nF0217 14:53:06.413704 6389 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce729
76b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.352635 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.352709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.352729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.352786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.352805 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:07Z","lastTransitionTime":"2026-02-17T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.360403 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc 
kubenswrapper[4717]: I0217 14:53:07.381968 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.402222 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.418303 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.430207 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.446842 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.455189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.455230 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:07 crc 
kubenswrapper[4717]: I0217 14:53:07.455239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.455254 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.455264 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:07Z","lastTransitionTime":"2026-02-17T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.460131 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.470478 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.483966 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.498606 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.509099 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:07Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.558033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.558534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.558740 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.558951 4717 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.559124 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:07Z","lastTransitionTime":"2026-02-17T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.661702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.662029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.662198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.662317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.662414 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:07Z","lastTransitionTime":"2026-02-17T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.765316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.765416 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.765432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.765452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.765464 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:07Z","lastTransitionTime":"2026-02-17T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.835843 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 05:17:24.376039565 +0000 UTC Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.846360 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.846446 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.846464 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.846361 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:07 crc kubenswrapper[4717]: E0217 14:53:07.846594 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:07 crc kubenswrapper[4717]: E0217 14:53:07.846715 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:07 crc kubenswrapper[4717]: E0217 14:53:07.846976 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:07 crc kubenswrapper[4717]: E0217 14:53:07.847130 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.868508 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.868846 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.868972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.869127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.869260 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:07Z","lastTransitionTime":"2026-02-17T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.972731 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.972775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.972789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.972813 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:07 crc kubenswrapper[4717]: I0217 14:53:07.972828 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:07Z","lastTransitionTime":"2026-02-17T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.075654 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.075714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.075733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.075761 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.075782 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:08Z","lastTransitionTime":"2026-02-17T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.178671 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.178739 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.178757 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.178787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.178849 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:08Z","lastTransitionTime":"2026-02-17T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.242747 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/2.log" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.247739 4717 scope.go:117] "RemoveContainer" containerID="b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322" Feb 17 14:53:08 crc kubenswrapper[4717]: E0217 14:53:08.248039 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.264358 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.280182 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc 
kubenswrapper[4717]: I0217 14:53:08.281457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.281584 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.281661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.281747 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.281822 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:08Z","lastTransitionTime":"2026-02-17T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.297053 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b2136
41fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.313819 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.327962 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.345593 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.370675 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:06Z\\\",\\\"message\\\":\\\"twork=default: []services.LB{}\\\\nI0217 14:53:06.414305 6389 services_controller.go:454] Service openshift-machine-api/machine-api-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 14:53:06.413756 6389 
base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 14:53:06.414231 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:06.414283 6389 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI0217 14:53:06.413829 6389 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-xr7pz in node crc\\\\nI0217 14:53:06.414442 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-xr7pz after 0 failed attempt(s)\\\\nI0217 14:53:06.414448 6389 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-xr7pz\\\\nF0217 14:53:06.413704 6389 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a24
5046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.385693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.385760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.385777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.385803 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.385820 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:08Z","lastTransitionTime":"2026-02-17T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.391898 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.403849 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.415036 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.428392 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443
a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.442518 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.457377 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc20
93476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:
52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.473722 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.486066 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.488149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.489037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.489125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.489153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.489170 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:08Z","lastTransitionTime":"2026-02-17T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.500940 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:08Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.591892 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.591948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.591967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.591988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.592003 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:08Z","lastTransitionTime":"2026-02-17T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.700346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.700406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.700417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.700440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.700460 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:08Z","lastTransitionTime":"2026-02-17T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.803327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.803516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.803534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.803555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.803567 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:08Z","lastTransitionTime":"2026-02-17T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.836032 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 18:25:31.636534526 +0000 UTC Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.906565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.906628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.906647 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.906673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.906692 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:08Z","lastTransitionTime":"2026-02-17T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:08 crc kubenswrapper[4717]: I0217 14:53:08.997448 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.009563 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.009594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.009603 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.009701 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.009712 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:09Z","lastTransitionTime":"2026-02-17T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.012786 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.026822 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.043494 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.062644 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5056
53ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.076803 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.091614 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.104173 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.114059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.114143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.114164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.114184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.114198 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:09Z","lastTransitionTime":"2026-02-17T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.117760 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z 
is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.128255 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.142414 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.157445 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.173727 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d3
1619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.190000 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.205695 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.218685 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.218754 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.218767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.218790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.218808 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:09Z","lastTransitionTime":"2026-02-17T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.236400 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:06Z\\\",\\\"message\\\":\\\"twork=default: []services.LB{}\\\\nI0217 14:53:06.414305 6389 services_controller.go:454] Service openshift-machine-api/machine-api-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 14:53:06.413756 6389 
base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 14:53:06.414231 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:06.414283 6389 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI0217 14:53:06.413829 6389 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-xr7pz in node crc\\\\nI0217 14:53:06.414442 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-xr7pz after 0 failed attempt(s)\\\\nI0217 14:53:06.414448 6389 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-xr7pz\\\\nF0217 14:53:06.413704 6389 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a24
5046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.251751 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.321803 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.321858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.321873 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.321894 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.321909 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:09Z","lastTransitionTime":"2026-02-17T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.425255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.425352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.425369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.425396 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.425415 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:09Z","lastTransitionTime":"2026-02-17T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.529666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.529725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.529749 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.529775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.529798 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:09Z","lastTransitionTime":"2026-02-17T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.633437 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.633510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.633527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.633549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.633566 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:09Z","lastTransitionTime":"2026-02-17T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.648284 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.648686 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:53:41.648641775 +0000 UTC m=+88.064482301 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.736795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.736845 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.736858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.736878 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.736892 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:09Z","lastTransitionTime":"2026-02-17T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.749856 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.749936 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.749973 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.750005 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.750038 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") 
pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750161 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750199 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750275 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:53:41.75024814 +0000 UTC m=+88.166088806 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750303 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:53:41.750292131 +0000 UTC m=+88.166132607 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750368 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750392 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750408 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750411 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750531 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750460 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-17 14:53:41.750437295 +0000 UTC m=+88.166277981 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750552 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750563 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750580 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs podName:f31d30c1-1e4a-49d3-adef-767a88616f33 nodeName:}" failed. No retries permitted until 2026-02-17 14:53:25.750553088 +0000 UTC m=+72.166393564 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs") pod "network-metrics-daemon-pzb78" (UID: "f31d30c1-1e4a-49d3-adef-767a88616f33") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.750600 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:53:41.750590419 +0000 UTC m=+88.166430895 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.837310 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 13:54:10.607060692 +0000 UTC Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.840528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.840562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.840573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.840592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.840609 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:09Z","lastTransitionTime":"2026-02-17T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.848280 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.848389 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.848746 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.848803 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.848794 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.848874 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.849003 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:09 crc kubenswrapper[4717]: E0217 14:53:09.849247 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.945133 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.945191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.945209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.945233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:09 crc kubenswrapper[4717]: I0217 14:53:09.945251 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:09Z","lastTransitionTime":"2026-02-17T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.048864 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.049503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.049541 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.049574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.049594 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:10Z","lastTransitionTime":"2026-02-17T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.153495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.153586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.153607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.153639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.153658 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:10Z","lastTransitionTime":"2026-02-17T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.256187 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.256236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.256254 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.256287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.256308 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:10Z","lastTransitionTime":"2026-02-17T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.363610 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.363675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.363692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.363719 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.363738 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:10Z","lastTransitionTime":"2026-02-17T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.467810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.467878 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.467895 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.467919 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.467936 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:10Z","lastTransitionTime":"2026-02-17T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.570841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.570904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.570917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.570945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.570959 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:10Z","lastTransitionTime":"2026-02-17T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.673901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.673939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.673952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.673972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.673985 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:10Z","lastTransitionTime":"2026-02-17T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.777431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.777477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.777504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.777528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.777548 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:10Z","lastTransitionTime":"2026-02-17T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.838200 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 13:04:07.877150878 +0000 UTC Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.880383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.880431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.880449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.880474 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.880493 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:10Z","lastTransitionTime":"2026-02-17T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.983234 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.983338 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.983354 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.983375 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:10 crc kubenswrapper[4717]: I0217 14:53:10.983391 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:10Z","lastTransitionTime":"2026-02-17T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.085769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.085831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.085850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.085870 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.085886 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:11Z","lastTransitionTime":"2026-02-17T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.188575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.188640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.188652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.188671 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.188684 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:11Z","lastTransitionTime":"2026-02-17T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.291919 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.292000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.292021 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.292050 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.292071 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:11Z","lastTransitionTime":"2026-02-17T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.395351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.395434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.395452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.395480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.395499 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:11Z","lastTransitionTime":"2026-02-17T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.499158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.499455 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.499518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.499613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.499686 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:11Z","lastTransitionTime":"2026-02-17T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.603148 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.603458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.603521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.603623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.603690 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:11Z","lastTransitionTime":"2026-02-17T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.705977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.706045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.706065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.706114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.706130 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:11Z","lastTransitionTime":"2026-02-17T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.808333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.808728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.808739 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.808757 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.808768 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:11Z","lastTransitionTime":"2026-02-17T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.839037 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 16:17:32.36926831 +0000 UTC Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.846600 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.846655 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.846603 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:11 crc kubenswrapper[4717]: E0217 14:53:11.846736 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.846612 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:11 crc kubenswrapper[4717]: E0217 14:53:11.846867 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:11 crc kubenswrapper[4717]: E0217 14:53:11.846934 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:11 crc kubenswrapper[4717]: E0217 14:53:11.846984 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.911245 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.911291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.911305 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.911333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:11 crc kubenswrapper[4717]: I0217 14:53:11.911344 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:11Z","lastTransitionTime":"2026-02-17T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.013285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.013322 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.013331 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.013347 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.013356 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:12Z","lastTransitionTime":"2026-02-17T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.115680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.115985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.116059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.116217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.116303 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:12Z","lastTransitionTime":"2026-02-17T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.218857 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.218905 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.218920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.218941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.218959 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:12Z","lastTransitionTime":"2026-02-17T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.321114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.321160 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.321206 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.321228 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.321238 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:12Z","lastTransitionTime":"2026-02-17T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.423704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.423782 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.423803 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.423831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.423852 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:12Z","lastTransitionTime":"2026-02-17T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.497926 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.526954 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.526994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.527006 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.527022 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.527038 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:12Z","lastTransitionTime":"2026-02-17T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.630309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.630361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.630373 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.630392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.630405 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:12Z","lastTransitionTime":"2026-02-17T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.733038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.733098 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.733113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.733132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.733145 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:12Z","lastTransitionTime":"2026-02-17T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.835972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.836028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.836039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.836060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.836073 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:12Z","lastTransitionTime":"2026-02-17T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.839208 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 06:41:05.376901927 +0000 UTC Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.935402 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.938858 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"na
me\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:12Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.939052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.939544 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.939570 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.939594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.939612 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:12Z","lastTransitionTime":"2026-02-17T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.960181 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:12Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.978711 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:12Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:12 crc kubenswrapper[4717]: I0217 14:53:12.993898 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:12Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.007470 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.024698 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.039523 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.043116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.043186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.043199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.043220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.043234 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:13Z","lastTransitionTime":"2026-02-17T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.059686 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.083990 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.103272 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.124322 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.144051 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d3
1619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.146481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.146551 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.146575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.146602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.146620 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:13Z","lastTransitionTime":"2026-02-17T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.164900 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.183641 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.205410 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:06Z\\\",\\\"message\\\":\\\"twork=default: []services.LB{}\\\\nI0217 14:53:06.414305 6389 services_controller.go:454] Service openshift-machine-api/machine-api-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 14:53:06.413756 6389 
base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 14:53:06.414231 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:06.414283 6389 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI0217 14:53:06.413829 6389 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-xr7pz in node crc\\\\nI0217 14:53:06.414442 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-xr7pz after 0 failed attempt(s)\\\\nI0217 14:53:06.414448 6389 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-xr7pz\\\\nF0217 14:53:06.413704 6389 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a24
5046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.219772 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:13Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.250113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.250166 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.250176 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.250192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.250203 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:13Z","lastTransitionTime":"2026-02-17T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.353208 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.353507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.353653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.353804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.353889 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:13Z","lastTransitionTime":"2026-02-17T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.456232 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.456266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.456277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.456295 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.456305 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:13Z","lastTransitionTime":"2026-02-17T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.558521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.558571 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.558605 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.558626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.558640 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:13Z","lastTransitionTime":"2026-02-17T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.662148 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.662209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.662220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.662241 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.662255 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:13Z","lastTransitionTime":"2026-02-17T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.766244 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.766310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.766327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.766352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.766372 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:13Z","lastTransitionTime":"2026-02-17T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.839845 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 19:21:03.787221502 +0000 UTC Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.846417 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:13 crc kubenswrapper[4717]: E0217 14:53:13.846654 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.846458 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.846437 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:13 crc kubenswrapper[4717]: E0217 14:53:13.846772 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.846811 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:13 crc kubenswrapper[4717]: E0217 14:53:13.846874 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:13 crc kubenswrapper[4717]: E0217 14:53:13.846933 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.870434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.870493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.870510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.870573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.870617 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:13Z","lastTransitionTime":"2026-02-17T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.974026 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.974107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.974119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.974141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:13 crc kubenswrapper[4717]: I0217 14:53:13.974155 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:13Z","lastTransitionTime":"2026-02-17T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.077392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.077459 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.077476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.077511 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.077532 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.180685 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.180758 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.180778 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.180807 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.180828 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.284246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.284309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.284325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.284361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.284380 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.387615 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.387709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.387729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.387763 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.387786 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.491600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.491668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.491686 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.491714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.491735 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.595392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.595445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.595455 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.595479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.595492 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.699377 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.699491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.699513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.699545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.699570 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.740726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.740795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.740814 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.740843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.740864 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: E0217 14:53:14.759785 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.763956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.763997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.764006 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.764024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.764524 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: E0217 14:53:14.781345 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.786400 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.786442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.786456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.786478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.786492 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: E0217 14:53:14.805390 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.810891 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.810923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.810934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.810953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.810964 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: E0217 14:53:14.831608 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.838672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.838731 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.838744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.838767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.838780 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.840218 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 09:16:04.598480297 +0000 UTC Feb 17 14:53:14 crc kubenswrapper[4717]: E0217 14:53:14.853575 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",
\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:14 crc kubenswrapper[4717]: E0217 14:53:14.853824 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.855959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.856012 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.856029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.856050 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.856062 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.959339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.959376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.959385 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.959403 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:14 crc kubenswrapper[4717]: I0217 14:53:14.959414 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:14Z","lastTransitionTime":"2026-02-17T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.082335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.082377 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.082393 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.082409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.082424 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:15Z","lastTransitionTime":"2026-02-17T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.184699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.184738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.184748 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.184762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.184775 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:15Z","lastTransitionTime":"2026-02-17T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.287863 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.287928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.287943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.287961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.287973 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:15Z","lastTransitionTime":"2026-02-17T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.391814 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.391871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.391887 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.391913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.391929 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:15Z","lastTransitionTime":"2026-02-17T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.494880 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.494925 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.494938 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.494956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.494969 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:15Z","lastTransitionTime":"2026-02-17T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.598018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.598064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.598097 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.598119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.598132 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:15Z","lastTransitionTime":"2026-02-17T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.700699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.700741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.700751 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.700767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.700778 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:15Z","lastTransitionTime":"2026-02-17T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.803681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.803763 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.803786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.803816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.803838 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:15Z","lastTransitionTime":"2026-02-17T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.840862 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 18:41:25.947015353 +0000 UTC Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.845808 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:15 crc kubenswrapper[4717]: E0217 14:53:15.846847 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.845902 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.845960 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.845834 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:15 crc kubenswrapper[4717]: E0217 14:53:15.849962 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:15 crc kubenswrapper[4717]: E0217 14:53:15.850120 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:15 crc kubenswrapper[4717]: E0217 14:53:15.850184 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.879965 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:06Z\\\",\\\"message\\\":\\\"twork=default: []services.LB{}\\\\nI0217 14:53:06.414305 6389 services_controller.go:454] Service openshift-machine-api/machine-api-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 14:53:06.413756 6389 
base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 14:53:06.414231 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:06.414283 6389 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI0217 14:53:06.413829 6389 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-xr7pz in node crc\\\\nI0217 14:53:06.414442 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-xr7pz after 0 failed attempt(s)\\\\nI0217 14:53:06.414448 6389 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-xr7pz\\\\nF0217 14:53:06.413704 6389 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a24
5046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:15Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.895857 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:15Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.906962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.907021 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.907033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.907054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.907067 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:15Z","lastTransitionTime":"2026-02-17T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.915858 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b2136
41fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:15Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.933696 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d3
1619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:15Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.951476 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:15Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.965290 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:15Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.981023 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:15Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:15 crc kubenswrapper[4717]: I0217 14:53:15.996296 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:15Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.008962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.009008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.009018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.009038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.009051 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:16Z","lastTransitionTime":"2026-02-17T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.009617 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb88e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.023860 4717 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.039733 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.059220 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.079436 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.095149 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"014dab04-a429-4107-89b9-53a382431c40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a40a2833a2d7bc1d9e94e0101eadfa1bc418f4143d686f41810b26566671bee\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff43e83143a6518fea4cb55cdc369649446cd2e15b3a776ac3df5f04f65a1744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1bef80b531a17e335240f260db84a0ed014c484239aa9d94d136ddf0188d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.113603 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.113758 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.113808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.113821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.113845 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.113860 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:16Z","lastTransitionTime":"2026-02-17T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.127198 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.142470 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.217045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.217124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.217139 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.217160 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.217207 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:16Z","lastTransitionTime":"2026-02-17T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.325283 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.325364 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.325376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.325395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.325406 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:16Z","lastTransitionTime":"2026-02-17T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.429315 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.429381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.429398 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.429425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.429442 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:16Z","lastTransitionTime":"2026-02-17T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.532796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.532876 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.532891 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.532915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.532934 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:16Z","lastTransitionTime":"2026-02-17T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.636253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.636301 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.636313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.636332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.636343 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:16Z","lastTransitionTime":"2026-02-17T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.742819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.742882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.742895 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.742915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.742929 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:16Z","lastTransitionTime":"2026-02-17T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.841565 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 02:08:12.856943314 +0000 UTC Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.846263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.846609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.846678 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.846744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.846849 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:16Z","lastTransitionTime":"2026-02-17T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.950154 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.950211 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.950231 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.950258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:16 crc kubenswrapper[4717]: I0217 14:53:16.950276 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:16Z","lastTransitionTime":"2026-02-17T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.052982 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.053026 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.053037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.053057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.053070 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:17Z","lastTransitionTime":"2026-02-17T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.156804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.156900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.156917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.156973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.156991 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:17Z","lastTransitionTime":"2026-02-17T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.259801 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.259879 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.259899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.259927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.259950 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:17Z","lastTransitionTime":"2026-02-17T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.363030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.363090 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.363100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.363116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.363125 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:17Z","lastTransitionTime":"2026-02-17T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.465173 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.465255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.465264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.465286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.465298 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:17Z","lastTransitionTime":"2026-02-17T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.567754 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.567811 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.567822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.567850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.567864 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:17Z","lastTransitionTime":"2026-02-17T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.670995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.671060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.671101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.671128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.671150 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:17Z","lastTransitionTime":"2026-02-17T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.774448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.774519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.774536 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.774564 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.774582 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:17Z","lastTransitionTime":"2026-02-17T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.842231 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 22:42:25.721362634 +0000 UTC Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.845768 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.845811 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.845846 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:17 crc kubenswrapper[4717]: E0217 14:53:17.845960 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:17 crc kubenswrapper[4717]: E0217 14:53:17.846129 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.846223 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:17 crc kubenswrapper[4717]: E0217 14:53:17.846401 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:17 crc kubenswrapper[4717]: E0217 14:53:17.846505 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.877999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.878112 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.878128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.878150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.878163 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:17Z","lastTransitionTime":"2026-02-17T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.981278 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.981362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.981384 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.981415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:17 crc kubenswrapper[4717]: I0217 14:53:17.981440 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:17Z","lastTransitionTime":"2026-02-17T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.084341 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.084407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.084426 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.084454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.084471 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:18Z","lastTransitionTime":"2026-02-17T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.187487 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.187538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.187552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.187574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.187588 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:18Z","lastTransitionTime":"2026-02-17T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.290164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.290243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.290267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.290301 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.290323 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:18Z","lastTransitionTime":"2026-02-17T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.393193 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.393226 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.393235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.393253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.393260 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:18Z","lastTransitionTime":"2026-02-17T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.496539 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.496566 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.496574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.496588 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.496597 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:18Z","lastTransitionTime":"2026-02-17T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.599565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.599642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.599704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.599738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.599761 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:18Z","lastTransitionTime":"2026-02-17T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.703039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.703142 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.703180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.703199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.703215 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:18Z","lastTransitionTime":"2026-02-17T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.806657 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.806705 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.806718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.806743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.806758 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:18Z","lastTransitionTime":"2026-02-17T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.842824 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 10:53:40.296465476 +0000 UTC Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.910678 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.910759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.910775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.910799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:18 crc kubenswrapper[4717]: I0217 14:53:18.910813 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:18Z","lastTransitionTime":"2026-02-17T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.013237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.013306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.013323 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.013350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.013369 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:19Z","lastTransitionTime":"2026-02-17T14:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.115981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.116015 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.116024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.116038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.116048 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:19Z","lastTransitionTime":"2026-02-17T14:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.218741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.218839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.218856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.218918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.218939 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:19Z","lastTransitionTime":"2026-02-17T14:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.322195 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.322244 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.322253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.322270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.322282 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:19Z","lastTransitionTime":"2026-02-17T14:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.425062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.425144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.425158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.425179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.425192 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:19Z","lastTransitionTime":"2026-02-17T14:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.528443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.528483 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.528494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.528514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.528527 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:19Z","lastTransitionTime":"2026-02-17T14:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.631451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.631487 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.631495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.631510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.631518 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:19Z","lastTransitionTime":"2026-02-17T14:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.734407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.734458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.734470 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.734489 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.734503 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:19Z","lastTransitionTime":"2026-02-17T14:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.837390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.837502 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.837518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.837544 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.837562 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:19Z","lastTransitionTime":"2026-02-17T14:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.843463 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 14:05:36.13006975 +0000 UTC Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.846227 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.846358 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:19 crc kubenswrapper[4717]: E0217 14:53:19.846485 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.846547 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.846600 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.846722 4717 scope.go:117] "RemoveContainer" containerID="b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322" Feb 17 14:53:19 crc kubenswrapper[4717]: E0217 14:53:19.846952 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:19 crc kubenswrapper[4717]: E0217 14:53:19.847035 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" Feb 17 14:53:19 crc kubenswrapper[4717]: E0217 14:53:19.846719 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:19 crc kubenswrapper[4717]: E0217 14:53:19.847150 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.940701 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.940755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.940770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.940788 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:19 crc kubenswrapper[4717]: I0217 14:53:19.940801 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:19Z","lastTransitionTime":"2026-02-17T14:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.043738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.043788 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.043799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.043819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.043835 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:20Z","lastTransitionTime":"2026-02-17T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.148493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.148997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.149011 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.149032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.149046 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:20Z","lastTransitionTime":"2026-02-17T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.252933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.253030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.253052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.253124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.253162 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:20Z","lastTransitionTime":"2026-02-17T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.357388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.357470 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.357490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.357519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.357540 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:20Z","lastTransitionTime":"2026-02-17T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.460721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.460810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.460831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.460864 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.460886 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:20Z","lastTransitionTime":"2026-02-17T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.564714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.564787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.564802 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.564824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.564840 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:20Z","lastTransitionTime":"2026-02-17T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.668816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.668873 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.668889 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.668912 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.668928 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:20Z","lastTransitionTime":"2026-02-17T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.771535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.771587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.771599 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.771616 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.771630 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:20Z","lastTransitionTime":"2026-02-17T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.844174 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 23:38:48.72894265 +0000 UTC Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.862214 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.874110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.874149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.874159 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.874175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.874186 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:20Z","lastTransitionTime":"2026-02-17T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.976339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.976370 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.976378 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.976392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:20 crc kubenswrapper[4717]: I0217 14:53:20.976404 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:20Z","lastTransitionTime":"2026-02-17T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.078897 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.078936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.078947 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.078960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.078969 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:21Z","lastTransitionTime":"2026-02-17T14:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.181501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.181546 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.181555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.181572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.181582 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:21Z","lastTransitionTime":"2026-02-17T14:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.284374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.284417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.284429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.284446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.284457 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:21Z","lastTransitionTime":"2026-02-17T14:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.386838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.386884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.386893 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.386909 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.386917 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:21Z","lastTransitionTime":"2026-02-17T14:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.489054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.489131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.489144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.489161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.489515 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:21Z","lastTransitionTime":"2026-02-17T14:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.592407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.592670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.592685 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.592702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.592712 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:21Z","lastTransitionTime":"2026-02-17T14:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.694894 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.694926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.694935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.694950 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.694958 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:21Z","lastTransitionTime":"2026-02-17T14:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.797318 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.797397 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.797411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.797456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.797469 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:21Z","lastTransitionTime":"2026-02-17T14:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.845294 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 13:55:12.487635713 +0000 UTC Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.849215 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:21 crc kubenswrapper[4717]: E0217 14:53:21.849348 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.849799 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:21 crc kubenswrapper[4717]: E0217 14:53:21.849869 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.849923 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:21 crc kubenswrapper[4717]: E0217 14:53:21.849978 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.850025 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:21 crc kubenswrapper[4717]: E0217 14:53:21.850167 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.900067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.900135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.900144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.900196 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:21 crc kubenswrapper[4717]: I0217 14:53:21.900208 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:21Z","lastTransitionTime":"2026-02-17T14:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.003546 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.003611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.003625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.003645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.003663 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:22Z","lastTransitionTime":"2026-02-17T14:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.107450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.107496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.107507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.107525 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.107539 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:22Z","lastTransitionTime":"2026-02-17T14:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.209433 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.209480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.209497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.209518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.209531 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:22Z","lastTransitionTime":"2026-02-17T14:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.312179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.312214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.312224 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.312241 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.312251 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:22Z","lastTransitionTime":"2026-02-17T14:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.414792 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.414831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.414844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.414863 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.414874 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:22Z","lastTransitionTime":"2026-02-17T14:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.518373 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.518435 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.518453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.518476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.518494 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:22Z","lastTransitionTime":"2026-02-17T14:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.620897 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.620956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.620972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.621017 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.621028 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:22Z","lastTransitionTime":"2026-02-17T14:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.724031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.724099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.724112 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.724130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.724144 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:22Z","lastTransitionTime":"2026-02-17T14:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.827810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.827864 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.827878 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.827927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.827941 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:22Z","lastTransitionTime":"2026-02-17T14:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.845931 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 07:24:28.579747176 +0000 UTC Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.931101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.931144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.931158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.931177 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:22 crc kubenswrapper[4717]: I0217 14:53:22.931190 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:22Z","lastTransitionTime":"2026-02-17T14:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.034983 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.035044 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.035056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.035080 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.035127 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:23Z","lastTransitionTime":"2026-02-17T14:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.137962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.138038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.138133 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.138166 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.138186 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:23Z","lastTransitionTime":"2026-02-17T14:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.241367 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.241415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.241427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.241453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.241470 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:23Z","lastTransitionTime":"2026-02-17T14:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.344376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.344432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.344445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.344464 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.344477 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:23Z","lastTransitionTime":"2026-02-17T14:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.447542 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.447614 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.447623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.447640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.447651 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:23Z","lastTransitionTime":"2026-02-17T14:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.549943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.549990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.550001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.550020 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.550032 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:23Z","lastTransitionTime":"2026-02-17T14:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.653106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.653174 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.653192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.653214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.653227 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:23Z","lastTransitionTime":"2026-02-17T14:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.755871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.755943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.755965 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.755986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.755999 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:23Z","lastTransitionTime":"2026-02-17T14:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.846131 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 01:02:58.574026439 +0000 UTC Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.846260 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.846298 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.846305 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.846358 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:23 crc kubenswrapper[4717]: E0217 14:53:23.846432 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:23 crc kubenswrapper[4717]: E0217 14:53:23.846570 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:23 crc kubenswrapper[4717]: E0217 14:53:23.846711 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:23 crc kubenswrapper[4717]: E0217 14:53:23.846828 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.858748 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.858821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.858843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.858872 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.858895 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:23Z","lastTransitionTime":"2026-02-17T14:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.961620 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.961669 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.961681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.961698 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:23 crc kubenswrapper[4717]: I0217 14:53:23.961709 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:23Z","lastTransitionTime":"2026-02-17T14:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.064913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.064974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.064988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.065013 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.065026 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.171808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.171860 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.171875 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.171892 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.171907 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.275249 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.275297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.275306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.275325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.275335 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.378127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.378178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.378192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.378215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.378235 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.480235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.480277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.480286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.480305 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.480315 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.582957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.583041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.583060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.583119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.583149 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.685811 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.685849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.685889 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.685910 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.685923 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.789361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.789423 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.789434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.789452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.789466 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.847131 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 18:43:21.055338272 +0000 UTC Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.892184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.892214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.892223 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.892239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.892249 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.897880 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.897913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.897924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.897937 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.897945 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: E0217 14:53:24.909640 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.913740 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.913793 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.913830 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.913844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.913853 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: E0217 14:53:24.927016 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.930759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.930797 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.930806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.930827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.930844 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: E0217 14:53:24.945723 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.950353 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.950443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.950465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.950502 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.950524 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: E0217 14:53:24.965929 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.970994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.971058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.971074 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.971116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.971131 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:24 crc kubenswrapper[4717]: E0217 14:53:24.989611 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:24 crc kubenswrapper[4717]: E0217 14:53:24.989734 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.994696 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.994743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.994752 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.994772 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:24 crc kubenswrapper[4717]: I0217 14:53:24.994783 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:24Z","lastTransitionTime":"2026-02-17T14:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.097999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.098050 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.098061 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.098100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.098112 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:25Z","lastTransitionTime":"2026-02-17T14:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.200957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.201446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.201478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.201498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.201516 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:25Z","lastTransitionTime":"2026-02-17T14:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.304219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.304263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.304277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.304294 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.304309 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:25Z","lastTransitionTime":"2026-02-17T14:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.408495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.408558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.408575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.408596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.408614 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:25Z","lastTransitionTime":"2026-02-17T14:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.511891 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.511962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.511980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.512003 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.512020 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:25Z","lastTransitionTime":"2026-02-17T14:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.614552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.614648 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.614663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.614690 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.614707 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:25Z","lastTransitionTime":"2026-02-17T14:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.717619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.717673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.717684 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.717704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.717716 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:25Z","lastTransitionTime":"2026-02-17T14:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.775535 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:25 crc kubenswrapper[4717]: E0217 14:53:25.775714 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:53:25 crc kubenswrapper[4717]: E0217 14:53:25.775787 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs podName:f31d30c1-1e4a-49d3-adef-767a88616f33 nodeName:}" failed. No retries permitted until 2026-02-17 14:53:57.775768964 +0000 UTC m=+104.191609430 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs") pod "network-metrics-daemon-pzb78" (UID: "f31d30c1-1e4a-49d3-adef-767a88616f33") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.821459 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.821532 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.821541 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.821575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.821590 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:25Z","lastTransitionTime":"2026-02-17T14:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.845814 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.845982 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:25 crc kubenswrapper[4717]: E0217 14:53:25.846074 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.846106 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:25 crc kubenswrapper[4717]: E0217 14:53:25.846248 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:25 crc kubenswrapper[4717]: E0217 14:53:25.846477 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.845868 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:25 crc kubenswrapper[4717]: E0217 14:53:25.847277 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.847393 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 09:03:47.389396713 +0000 UTC Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.872391 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-02-17T14:53:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.888539 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.904372 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.917584 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443
a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.926298 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.926339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.926539 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:25 crc 
kubenswrapper[4717]: I0217 14:53:25.926870 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.926975 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:25Z","lastTransitionTime":"2026-02-17T14:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.931603 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.949577 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f
2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa0
4b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.963603 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"014dab04-a429-4107-89b9-53a382431c40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a40a2833a2d7bc1d9e94e0101eadfa1bc418f4143d686f41810b26566671bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff43e83143a6518fea4cb55cdc369649446cd2e15b3a776ac3df5f04f65a1744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1bef80b531a17e335240f260db84a0ed014c484239aa9d94d136ddf0188d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.976306 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe92c1f-25d7-43f8-8644-6295e9c25321\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e9be80a82dc2fb9996d26cdf005ecf8ac03a19141cac4a5928d632cfc1e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:25 crc kubenswrapper[4717]: I0217 14:53:25.992380 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.005493 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.019588 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.030759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.030831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.030844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.030863 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.030875 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:26Z","lastTransitionTime":"2026-02-17T14:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.035011 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.046832 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:26 crc 
kubenswrapper[4717]: I0217 14:53:26.061811 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.077277 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d3
1619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.094538 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.109679 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.131588 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:06Z\\\",\\\"message\\\":\\\"twork=default: []services.LB{}\\\\nI0217 14:53:06.414305 6389 services_controller.go:454] Service openshift-machine-api/machine-api-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 14:53:06.413756 6389 
base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 14:53:06.414231 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:06.414283 6389 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI0217 14:53:06.413829 6389 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-xr7pz in node crc\\\\nI0217 14:53:06.414442 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-xr7pz after 0 failed attempt(s)\\\\nI0217 14:53:06.414448 6389 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-xr7pz\\\\nF0217 14:53:06.413704 6389 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a24
5046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.133440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.133478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.133488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.133507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.133519 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:26Z","lastTransitionTime":"2026-02-17T14:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.236337 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.236405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.236417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.236454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.236468 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:26Z","lastTransitionTime":"2026-02-17T14:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.339374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.339450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.339468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.339497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.339516 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:26Z","lastTransitionTime":"2026-02-17T14:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.444900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.444959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.444969 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.444988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.444997 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:26Z","lastTransitionTime":"2026-02-17T14:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.547993 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.548036 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.548045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.548062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.548072 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:26Z","lastTransitionTime":"2026-02-17T14:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.651395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.651462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.651478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.651500 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.651515 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:26Z","lastTransitionTime":"2026-02-17T14:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.754885 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.754946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.754960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.754982 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.754996 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:26Z","lastTransitionTime":"2026-02-17T14:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.857865 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 00:57:34.282552573 +0000 UTC Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.858269 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.858320 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.858331 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.858349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.858361 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:26Z","lastTransitionTime":"2026-02-17T14:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.961522 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.961626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.961639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.961658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:26 crc kubenswrapper[4717]: I0217 14:53:26.961670 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:26Z","lastTransitionTime":"2026-02-17T14:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.064395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.064439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.064450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.064466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.064477 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:27Z","lastTransitionTime":"2026-02-17T14:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.166732 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.166784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.166793 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.166810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.166819 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:27Z","lastTransitionTime":"2026-02-17T14:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.269411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.269486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.269496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.269515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.269524 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:27Z","lastTransitionTime":"2026-02-17T14:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.372115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.372150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.372159 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.372175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.372184 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:27Z","lastTransitionTime":"2026-02-17T14:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.475675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.475743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.475759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.475785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.475799 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:27Z","lastTransitionTime":"2026-02-17T14:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.578217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.578263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.578279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.578297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.578308 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:27Z","lastTransitionTime":"2026-02-17T14:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.681384 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.681431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.681441 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.681463 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.681473 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:27Z","lastTransitionTime":"2026-02-17T14:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.784102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.784604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.784746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.785013 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.785164 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:27Z","lastTransitionTime":"2026-02-17T14:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.847018 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.847316 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.847136 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.847194 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:27 crc kubenswrapper[4717]: E0217 14:53:27.847492 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:27 crc kubenswrapper[4717]: E0217 14:53:27.847698 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:27 crc kubenswrapper[4717]: E0217 14:53:27.847789 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:27 crc kubenswrapper[4717]: E0217 14:53:27.847866 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.858435 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 14:11:54.804504533 +0000 UTC Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.888157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.888213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.888232 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.888256 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.888276 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:27Z","lastTransitionTime":"2026-02-17T14:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.991358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.991422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.991438 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.991465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:27 crc kubenswrapper[4717]: I0217 14:53:27.991483 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:27Z","lastTransitionTime":"2026-02-17T14:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.094282 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.094340 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.094350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.094369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.094381 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:28Z","lastTransitionTime":"2026-02-17T14:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.197741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.197804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.197821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.197845 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.197863 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:28Z","lastTransitionTime":"2026-02-17T14:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.300756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.300813 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.300827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.300856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.300872 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:28Z","lastTransitionTime":"2026-02-17T14:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.319350 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nfvrt_3daa865c-6e58-4512-9be1-5d3a490a2f7a/kube-multus/0.log" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.319453 4717 generic.go:334] "Generic (PLEG): container finished" podID="3daa865c-6e58-4512-9be1-5d3a490a2f7a" containerID="fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557" exitCode=1 Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.319516 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nfvrt" event={"ID":"3daa865c-6e58-4512-9be1-5d3a490a2f7a","Type":"ContainerDied","Data":"fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557"} Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.320263 4717 scope.go:117] "RemoveContainer" containerID="fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.337125 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.354798 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.372544 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.389355 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5056
53ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.402612 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"014dab04-a429-4107-89b9-53a382431c40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a40a2833a2d7bc1d9e94e0101eadfa1bc418f4143d686f41810b26566671bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff43e83143a6518fea4cb55cdc369649446cd2e15b3a776ac3df5f04f65a1744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1bef80b531a17e335240f260db84a0ed014c484239aa9d94d136ddf0188d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9
d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.404569 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.404743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.404847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.404949 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:28 crc 
kubenswrapper[4717]: I0217 14:53:28.405052 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:28Z","lastTransitionTime":"2026-02-17T14:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.416567 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe92c1f-25d7-43f8-8644-6295e9c25321\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e9be80a82dc2fb9996d26cdf005ecf8ac03a19141cac4a5928d632cfc1e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.429610 4717 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.444226 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.458934 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.473114 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:27Z\\\",\\\"message\\\":\\\"2026-02-17T14:52:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a\\\\n2026-02-17T14:52:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a to /host/opt/cni/bin/\\\\n2026-02-17T14:52:42Z [verbose] multus-daemon started\\\\n2026-02-17T14:52:42Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:53:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.487334 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.501870 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.507348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 
14:53:28.507401 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.507414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.507432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.507446 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:28Z","lastTransitionTime":"2026-02-17T14:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.520958 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.540181 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d3
1619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.558400 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.572377 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.610732 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.611181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.611257 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.611332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.611414 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:28Z","lastTransitionTime":"2026-02-17T14:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.613673 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:06Z\\\",\\\"message\\\":\\\"twork=default: []services.LB{}\\\\nI0217 14:53:06.414305 6389 services_controller.go:454] Service openshift-machine-api/machine-api-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 14:53:06.413756 6389 
base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 14:53:06.414231 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:06.414283 6389 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI0217 14:53:06.413829 6389 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-xr7pz in node crc\\\\nI0217 14:53:06.414442 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-xr7pz after 0 failed attempt(s)\\\\nI0217 14:53:06.414448 6389 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-xr7pz\\\\nF0217 14:53:06.413704 6389 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a24
5046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.627606 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.715011 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.715110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.715129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.715156 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.715175 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:28Z","lastTransitionTime":"2026-02-17T14:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.819248 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.819297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.819309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.819336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.819350 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:28Z","lastTransitionTime":"2026-02-17T14:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.859238 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 23:30:24.884230714 +0000 UTC Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.922927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.922986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.923001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.923023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:28 crc kubenswrapper[4717]: I0217 14:53:28.923038 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:28Z","lastTransitionTime":"2026-02-17T14:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.025384 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.025509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.025540 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.025576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.025601 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:29Z","lastTransitionTime":"2026-02-17T14:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.128617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.128677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.128695 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.128722 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.128737 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:29Z","lastTransitionTime":"2026-02-17T14:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.231789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.231851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.231863 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.231883 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.231898 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:29Z","lastTransitionTime":"2026-02-17T14:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.325644 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nfvrt_3daa865c-6e58-4512-9be1-5d3a490a2f7a/kube-multus/0.log" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.325721 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nfvrt" event={"ID":"3daa865c-6e58-4512-9be1-5d3a490a2f7a","Type":"ContainerStarted","Data":"01f9e46aae86320f6c1bcc534e3cbe373ef0df5082d7386e2e382d1c60228ca6"} Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.335666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.335724 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.335738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.335758 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.335773 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:29Z","lastTransitionTime":"2026-02-17T14:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.344863 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.360292 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.375885 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.390973 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"014dab04-a429-4107-89b9-53a382431c40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a40a2833a2d7bc1d9e94e0101eadfa1bc418f4143d686f41810b26566671bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff43e83143a6518fea4cb55cdc369649446cd2e15b3a776ac3df5f04f65a1744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1bef80b531a17e335240f260db84a0ed014c484239aa9d94d136ddf0188d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.406393 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe92c1f-25d7-43f8-8644-6295e9c25321\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e9be80a82dc2fb9996d26cdf005ecf8ac03a19141cac4a5928d632cfc1e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.420134 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.432182 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.438615 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.438660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.438672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.438688 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.438700 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:29Z","lastTransitionTime":"2026-02-17T14:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.445525 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.461237 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f9e46aae86320f6c1bcc534e3cbe373ef0df5082d7386e2e382d1c60228ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:27Z\\\",\\\"message\\\":\\\"2026-02-17T14:52:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a\\\\n2026-02-17T14:52:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a to /host/opt/cni/bin/\\\\n2026-02-17T14:52:42Z [verbose] multus-daemon started\\\\n2026-02-17T14:52:42Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T14:53:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.477222 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257
2529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.488414 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.501058 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.515690 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.531573 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d3
1619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.541822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.541868 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.541898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.541918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.541933 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:29Z","lastTransitionTime":"2026-02-17T14:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.551647 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.568975 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.599519 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:06Z\\\",\\\"message\\\":\\\"twork=default: []services.LB{}\\\\nI0217 14:53:06.414305 6389 services_controller.go:454] Service openshift-machine-api/machine-api-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 14:53:06.413756 6389 
base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 14:53:06.414231 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:06.414283 6389 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI0217 14:53:06.413829 6389 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-xr7pz in node crc\\\\nI0217 14:53:06.414442 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-xr7pz after 0 failed attempt(s)\\\\nI0217 14:53:06.414448 6389 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-xr7pz\\\\nF0217 14:53:06.413704 6389 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a24
5046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.613192 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.644467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.644512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.644527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.644545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.644561 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:29Z","lastTransitionTime":"2026-02-17T14:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.747327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.747400 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.747425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.747455 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.747525 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:29Z","lastTransitionTime":"2026-02-17T14:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.846158 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:29 crc kubenswrapper[4717]: E0217 14:53:29.846347 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.846480 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:29 crc kubenswrapper[4717]: E0217 14:53:29.846542 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.846601 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:29 crc kubenswrapper[4717]: E0217 14:53:29.846654 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.846715 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:29 crc kubenswrapper[4717]: E0217 14:53:29.846769 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.850681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.850717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.850729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.850744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.850757 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:29Z","lastTransitionTime":"2026-02-17T14:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.860230 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 15:40:12.676147184 +0000 UTC Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.954787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.954830 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.954846 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.954869 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:29 crc kubenswrapper[4717]: I0217 14:53:29.954898 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:29Z","lastTransitionTime":"2026-02-17T14:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.058483 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.058559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.058576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.058601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.058622 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:30Z","lastTransitionTime":"2026-02-17T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.161466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.161531 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.161544 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.161566 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.161583 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:30Z","lastTransitionTime":"2026-02-17T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.264709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.264766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.264780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.264798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.264815 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:30Z","lastTransitionTime":"2026-02-17T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.367704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.367794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.367818 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.367851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.367880 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:30Z","lastTransitionTime":"2026-02-17T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.471504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.471612 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.471630 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.471657 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.471676 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:30Z","lastTransitionTime":"2026-02-17T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.575208 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.575291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.575316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.575348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.575374 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:30Z","lastTransitionTime":"2026-02-17T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.678075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.678181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.678195 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.678215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.678228 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:30Z","lastTransitionTime":"2026-02-17T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.781622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.781685 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.781701 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.781725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.781747 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:30Z","lastTransitionTime":"2026-02-17T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.860861 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 19:32:45.733358828 +0000 UTC Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.885308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.885371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.885389 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.885411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.885429 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:30Z","lastTransitionTime":"2026-02-17T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.989038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.989152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.989175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.989203 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:30 crc kubenswrapper[4717]: I0217 14:53:30.989222 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:30Z","lastTransitionTime":"2026-02-17T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.092469 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.092532 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.092550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.092573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.092589 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:31Z","lastTransitionTime":"2026-02-17T14:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.196138 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.196190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.196208 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.196231 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.196246 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:31Z","lastTransitionTime":"2026-02-17T14:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.299299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.299672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.299917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.300027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.300169 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:31Z","lastTransitionTime":"2026-02-17T14:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.404233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.404648 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.404803 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.404948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.405204 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:31Z","lastTransitionTime":"2026-02-17T14:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.508383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.508665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.508790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.508884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.508970 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:31Z","lastTransitionTime":"2026-02-17T14:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.612968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.613008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.613021 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.613039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.613052 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:31Z","lastTransitionTime":"2026-02-17T14:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.717343 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.717682 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.717854 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.717976 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.718141 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:31Z","lastTransitionTime":"2026-02-17T14:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.821504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.821891 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.822153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.822415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.822587 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:31Z","lastTransitionTime":"2026-02-17T14:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.847273 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.847360 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.847436 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:31 crc kubenswrapper[4717]: E0217 14:53:31.847762 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:31 crc kubenswrapper[4717]: E0217 14:53:31.848401 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.848610 4717 scope.go:117] "RemoveContainer" containerID="b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.848644 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:31 crc kubenswrapper[4717]: E0217 14:53:31.848480 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:31 crc kubenswrapper[4717]: E0217 14:53:31.849319 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.861394 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 04:34:49.147911607 +0000 UTC Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.925665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.926173 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.926266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.926413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:31 crc kubenswrapper[4717]: I0217 14:53:31.926547 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:31Z","lastTransitionTime":"2026-02-17T14:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.030948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.031009 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.031023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.031043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.031057 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:32Z","lastTransitionTime":"2026-02-17T14:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.133440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.133493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.133502 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.133519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.133530 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:32Z","lastTransitionTime":"2026-02-17T14:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.237043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.237124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.237136 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.237157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.237172 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:32Z","lastTransitionTime":"2026-02-17T14:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.339155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.339204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.339217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.339237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.339251 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:32Z","lastTransitionTime":"2026-02-17T14:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.340934 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/2.log" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.345041 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b"} Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.345650 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.364309 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.380027 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.401135 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.422964 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe92c1f-25d7-43f8-8644-6295e9c25321\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e9be80a82dc2fb9996d26cdf005ecf8ac03a19141cac4a5928d632cfc1e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.442768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.442827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.442842 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.442864 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.442884 4717 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:32Z","lastTransitionTime":"2026-02-17T14:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.448522 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.465156 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.482301 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.502050 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f9e46aae86320f6c1bcc534e3cbe373ef0df5082d7386e2e382d1c60228ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:27Z\\\",\\\"message\\\":\\\"2026-02-17T14:52:41+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a\\\\n2026-02-17T14:52:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a to /host/opt/cni/bin/\\\\n2026-02-17T14:52:42Z [verbose] multus-daemon started\\\\n2026-02-17T14:52:42Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:53:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.530531 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5056
53ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.545867 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.545932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.545946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.545975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.546006 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:32Z","lastTransitionTime":"2026-02-17T14:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.547803 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"014dab04-a429-4107-89b9-53a382431c40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a40a2833a2d7bc1d9e94e0101eadfa1bc418f4143d686f41810b26566671bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff43e83143a6518fea4cb55cdc3696
49446cd2e15b3a776ac3df5f04f65a1744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1bef80b531a17e335240f260db84a0ed014c484239aa9d94d136ddf0188d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.567301 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.591452 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.606128 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d3
1619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.621997 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.636653 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.649530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.649590 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.649602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.649622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.649636 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:32Z","lastTransitionTime":"2026-02-17T14:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.665035 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:06Z\\\",\\\"message\\\":\\\"twork=default: []services.LB{}\\\\nI0217 14:53:06.414305 6389 services_controller.go:454] Service openshift-machine-api/machine-api-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 14:53:06.413756 6389 
base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 14:53:06.414231 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:06.414283 6389 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI0217 14:53:06.413829 6389 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-xr7pz in node crc\\\\nI0217 14:53:06.414442 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-xr7pz after 0 failed attempt(s)\\\\nI0217 14:53:06.414448 6389 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-xr7pz\\\\nF0217 14:53:06.413704 6389 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:53:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.681386 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc 
kubenswrapper[4717]: I0217 14:53:32.700807 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.753495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.753561 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.753575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.753598 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.753614 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:32Z","lastTransitionTime":"2026-02-17T14:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.857485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.857552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.857564 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.857584 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.857596 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:32Z","lastTransitionTime":"2026-02-17T14:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.861728 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 11:57:39.919817575 +0000 UTC Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.960509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.960562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.960571 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.960589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:32 crc kubenswrapper[4717]: I0217 14:53:32.960604 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:32Z","lastTransitionTime":"2026-02-17T14:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.064511 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.064613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.064631 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.064653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.064669 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:33Z","lastTransitionTime":"2026-02-17T14:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.168175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.168299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.168326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.168358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.168383 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:33Z","lastTransitionTime":"2026-02-17T14:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.271612 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.271680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.271697 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.271722 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.271740 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:33Z","lastTransitionTime":"2026-02-17T14:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.353499 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/3.log" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.354896 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/2.log" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.360162 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" exitCode=1 Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.360237 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b"} Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.360305 4717 scope.go:117] "RemoveContainer" containerID="b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.363414 4717 scope.go:117] "RemoveContainer" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" Feb 17 14:53:33 crc kubenswrapper[4717]: E0217 14:53:33.365283 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.377000 4717 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.377051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.377071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.377127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.377188 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:33Z","lastTransitionTime":"2026-02-17T14:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.389590 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.413444 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.443593 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d3
1619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.471313 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.480955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.481027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.481129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.481163 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.481192 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:33Z","lastTransitionTime":"2026-02-17T14:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.489435 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.512252 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5670a1ac4ce15f16c95e8c4c562f85519351e69a891409f9c46e586200dc322\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:06Z\\\",\\\"message\\\":\\\"twork=default: []services.LB{}\\\\nI0217 14:53:06.414305 6389 services_controller.go:454] Service openshift-machine-api/machine-api-operator for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 14:53:06.413756 6389 
base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 14:53:06.414231 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:06.414283 6389 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI0217 14:53:06.413829 6389 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-xr7pz in node crc\\\\nI0217 14:53:06.414442 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-xr7pz after 0 failed attempt(s)\\\\nI0217 14:53:06.414448 6389 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-xr7pz\\\\nF0217 14:53:06.413704 6389 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:32Z\\\",\\\"message\\\":\\\"work-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/iptables-alerter-4ln5h]\\\\nI0217 14:53:32.965844 6786 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0217 14:53:32.965887 6786 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:32.965924 6786 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:32.965955 6786 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0217 14:53:32.965979 6786 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0217 14:53:32.966000 6786 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:32.966034 6786 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:53:32.966201 6786 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/
net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.530624 4717 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc 
kubenswrapper[4717]: I0217 14:53:33.552822 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.572537 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.584073 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.584153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.584170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.584198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.584216 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:33Z","lastTransitionTime":"2026-02-17T14:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.588059 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.605836 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f9e46aae86320f6c1bcc534e3cbe373ef0df5082d7386e2e382d1c60228ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:27Z\\\",\\\"message\\\":\\\"2026-02-17T14:52:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a\\\\n2026-02-17T14:52:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a to /host/opt/cni/bin/\\\\n2026-02-17T14:52:42Z [verbose] multus-daemon started\\\\n2026-02-17T14:52:42Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T14:53:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.621480 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257
2529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.635662 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"014dab04-a429-4107-89b9-53a382431c40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a40a2833a2d7bc1d9e94e0101eadfa1bc418f4143d686f41810b26566671bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff43e83143a6518fea4cb55cdc369649446cd2e15b3a776ac3df5f04f65a1744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1bef80b531a17e335240f260db84a0ed014c484239aa9d94d136ddf0188d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.651650 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe92c1f-25d7-43f8-8644-6295e9c25321\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e9be80a82dc2fb9996d26cdf005ecf8ac03a19141cac4a5928d632cfc1e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.669281 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.681658 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.686591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.686661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.686680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.686705 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.686723 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:33Z","lastTransitionTime":"2026-02-17T14:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.695528 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.710835 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.789456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.789494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.789503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.789540 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.789552 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:33Z","lastTransitionTime":"2026-02-17T14:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.847706 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.847802 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.847873 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.847743 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:33 crc kubenswrapper[4717]: E0217 14:53:33.847939 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:33 crc kubenswrapper[4717]: E0217 14:53:33.848098 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:33 crc kubenswrapper[4717]: E0217 14:53:33.848187 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:33 crc kubenswrapper[4717]: E0217 14:53:33.848250 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.862540 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 09:28:57.163162805 +0000 UTC Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.872884 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.892690 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.892738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.892750 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.892770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:33 crc kubenswrapper[4717]: I0217 14:53:33.892785 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:33Z","lastTransitionTime":"2026-02-17T14:53:33Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.003571 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.003646 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.003665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.003692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.003713 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:34Z","lastTransitionTime":"2026-02-17T14:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.111804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.111879 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.111900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.111936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.111963 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:34Z","lastTransitionTime":"2026-02-17T14:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.215754 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.215828 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.215847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.215875 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.215893 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:34Z","lastTransitionTime":"2026-02-17T14:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.318962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.319019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.319038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.319064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.319129 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:34Z","lastTransitionTime":"2026-02-17T14:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.368557 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/3.log" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.375181 4717 scope.go:117] "RemoveContainer" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" Feb 17 14:53:34 crc kubenswrapper[4717]: E0217 14:53:34.375944 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.402997 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.422376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.422484 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.422512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.422549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.422569 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:34Z","lastTransitionTime":"2026-02-17T14:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.424300 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.445447 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.464160 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443
a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.486812 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f9e46aae86320f6c1bcc534e3cbe373ef0df5082d7386e2e382d1c60228ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:27Z\\\",\\\"message\\\":\\\"2026-02-17T14:52:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a\\\\n2026-02-17T14:52:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a to /host/opt/cni/bin/\\\\n2026-02-17T14:52:42Z [verbose] multus-daemon started\\\\n2026-02-17T14:52:42Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T14:53:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.503547 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257
2529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.522047 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"014dab04-a429-4107-89b9-53a382431c40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a40a2833a2d7bc1d9e94e0101eadfa1bc418f4143d686f41810b26566671bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff43e83143a6518fea4cb55cdc369649446cd2e15b3a776ac3df5f04f65a1744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1bef80b531a17e335240f260db84a0ed014c484239aa9d94d136ddf0188d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.525021 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.525073 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.525118 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.525146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.525163 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:34Z","lastTransitionTime":"2026-02-17T14:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.540181 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe92c1f-25d7-43f8-8644-6295e9c25321\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e9be80a82dc2fb9996d26cdf005ecf8ac03a19141cac4a5928d632cfc1e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.579238 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b33a42a-1af6-4395-9331-d7aa78e0f2a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00b9dbac5f1b401bd29895d158a6375875fde57062dd9a9e3d50b662a4fd8c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96469e1b3a3e0c2a7a131f109884c7f6d1c4f72a8959c15f308ff276883687e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://849cfdcc83a8c65d74eaa79ec65157725c36f10f879a0e48aa3b9a9483886652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a24c45d759a685174f2796c6b4693de58dfdb082ccbc9b889857572694512ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a093154833c54fe11fcf8321bba7d1e5fd88b07259bc6012bce1eac5de58cfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6bbb6613117300efa6c728ba584bd5bdeccefa8fc50d1fe9864e603ac360fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6bbb6613117300efa6c728ba584bd5bdeccefa8fc50d1fe9864e603ac360fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15888f573d471aca492e61355f53b2471cb3b6f73b5329357afc1329dcee5d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15888f573d471aca492e61355f53b2471cb3b6f73b5329357afc1329dcee5d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a59f8ffca58355293ded83f23c41b7528ce508ce893c67a9b0ad699013737cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a59f8ffca58355293ded83f23c41b7528ce508ce893c67a9b0ad699013737cf2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.600198 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.620702 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.628239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.628280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.628309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.628327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.628343 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:34Z","lastTransitionTime":"2026-02-17T14:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.631572 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.678234 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.706878 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc 
kubenswrapper[4717]: I0217 14:53:34.721577 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.731214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.731298 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.731312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.731332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.731368 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:34Z","lastTransitionTime":"2026-02-17T14:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.754640 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c
f697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.770998 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.788333 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.813437 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:32Z\\\",\\\"message\\\":\\\"work-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/iptables-alerter-4ln5h]\\\\nI0217 14:53:32.965844 6786 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0217 14:53:32.965887 6786 obj_retry.go:303] Retry object setup: 
*v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:32.965924 6786 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:32.965955 6786 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0217 14:53:32.965979 6786 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0217 14:53:32.966000 6786 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:32.966034 6786 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:53:32.966201 6786 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a24
5046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.834780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.834829 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.834847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.834870 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.834883 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:34Z","lastTransitionTime":"2026-02-17T14:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.863613 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 20:05:25.370026128 +0000 UTC Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.938625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.938673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.938682 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.938704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:34 crc kubenswrapper[4717]: I0217 14:53:34.938718 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:34Z","lastTransitionTime":"2026-02-17T14:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.042878 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.042933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.042954 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.042977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.042991 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.145847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.145905 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.145915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.145933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.145947 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.222984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.223053 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.223064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.223105 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.223116 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: E0217 14:53:35.244975 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.250190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.250253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.250267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.250291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.250304 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: E0217 14:53:35.267986 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.272728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.272783 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.272802 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.272831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.272848 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: E0217 14:53:35.294139 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.299502 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.299642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.299662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.299693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.299711 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: E0217 14:53:35.320818 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.325861 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.325945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.325964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.325990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.326011 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: E0217 14:53:35.343051 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8454a012-5158-4995-9509-e8fe74ab1270\\\",\\\"systemUUID\\\":\\\"7c6444a0-8f9e-4f16-931b-2a332675c205\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:35 crc kubenswrapper[4717]: E0217 14:53:35.343209 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.345120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.345167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.345183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.345209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.345227 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.448711 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.448799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.448822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.448855 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.448881 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.551851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.551916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.551933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.551984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.552001 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.654722 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.654789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.654825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.654861 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.654884 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.758432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.758524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.758547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.758577 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.758602 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.845990 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.846047 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:35 crc kubenswrapper[4717]: E0217 14:53:35.846394 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.846475 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:35 crc kubenswrapper[4717]: E0217 14:53:35.846555 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.845990 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:35 crc kubenswrapper[4717]: E0217 14:53:35.846655 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:35 crc kubenswrapper[4717]: E0217 14:53:35.846715 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.862213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.862287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.862312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.862343 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.862368 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.864752 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 06:25:00.947553985 +0000 UTC Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.869827 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b0823f1b7b7f653cb23aa1645170aeee5780d919e2e0bde70462ec0bbe90bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.901739 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:32Z\\\",\\\"message\\\":\\\"work-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/iptables-alerter-4ln5h]\\\\nI0217 14:53:32.965844 6786 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0217 14:53:32.965887 6786 obj_retry.go:303] Retry object setup: 
*v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:32.965924 6786 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:32.965955 6786 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0217 14:53:32.965979 6786 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0217 14:53:32.966000 6786 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0217 14:53:32.966034 6786 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:53:32.966201 6786 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:53:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d04c50f992c75a24
5046d1d311f6b94e307cfef5d85f6139964308ce72976b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkszq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4f7wr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.919735 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pzb78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31d30c1-1e4a-49d3-adef-767a88616f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-65w24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pzb78\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.956267 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb246cca-c319-441d-bf03-0dfb2bb44f9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2284915df4b96bac2fccb6b7949ae9c6622889d415597eab3aa5ed56e141466b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f1b4dbe4655348c21f9954dbf4c1452f40ab817e51f10e46af94c1b29d4874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd8124c77bdedf0a73606a94a6761ccb3194154d6375a5d7df1f2ae1abc1899\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.964956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.965029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.965052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 
14:53:35.965112 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.965134 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:35Z","lastTransitionTime":"2026-02-17T14:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:35 crc kubenswrapper[4717]: I0217 14:53:35.978564 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5c0095d-2b66-4009-a169-addd2dae6c89\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:52:31.717234 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:52:31.718116 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-325267206/tls.crt::/tmp/serving-cert-325267206/tls.key\\\\\\\"\\\\nI0217 14:52:37.149062 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:52:37.153244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:52:37.153266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:52:37.153286 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:52:37.153291 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:52:37.158990 1 secure_serving.go:57] 
Forcing use of http/1.1 only\\\\nW0217 14:52:37.159097 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:52:37.159161 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:52:37.159187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:52:37.159212 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:52:37.159235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:52:37.159013 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:52:37.160930 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.001030 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b754d9f081e24014f8fae0f2e533d0cf02c81e51c6c62837338f59bd78019453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.023768 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e24d71823005631f3f031f0bc951efc2afc9046ab722d8c02114dfa8afcbff5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e630a25d6f00dfa734042383c70f3071dc2ad92eb0ec0d914a1c4779baa0229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.044129 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.062477 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"757e9eb2-a664-4fca-b745-5c3152a4c613\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cc92e501ac84d1d7e5756b2e4879210f85d2dccfc29a49728bf85e72ec9e43c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1360f46d3ef4ba0e147e5a0f6df374b887bb8
8e34940c632365b2a69875e393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7sx8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4hbqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.071493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.071573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.071597 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.071628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.071654 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:36Z","lastTransitionTime":"2026-02-17T14:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.090867 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.109332 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xr7pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3306d20c-0893-4cc3-b35d-41b6365c7aaa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a05c6e5be76920ae37f599717ad4985baffaabf27196b85a11b18e779d93d42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xwkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xr7pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.130457 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08ffcd6a1b35bff8bae051eead1dc8d38ea0e0d10bde11a60157fcb0244d81ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dtt4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.152033 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nfvrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daa865c-6e58-4512-9be1-5d3a490a2f7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f9e46aae86320f6c1bcc534e3cbe373ef0df5082d7386e2e382d1c60228ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:53:27Z\\\",\\\"message\\\":\\\"2026-02-17T14:52:41+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a\\\\n2026-02-17T14:52:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4323f213-a8c5-44d2-aeec-3a86431d8c0a to /host/opt/cni/bin/\\\\n2026-02-17T14:52:42Z [verbose] multus-daemon started\\\\n2026-02-17T14:52:42Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:53:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:53:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2khb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nfvrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.171509 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1be0ab-a28a-4d7a-b871-d9fc8dea5841\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2572529bab39b64691303091fa128327f8ea667052943591867508986e0d2aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58a336c9c96a856dfd0d0b41ae8d519881491f961951ddc2093476414befb9d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461c447dbc25a914de57b14490ea4d611801a7c66e8c3790258b6b35a997f653\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7c83940edb2eb960591b0fc2bb6e779bbe4197ff9754ca6e18411943b0c516\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5056
53ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c505653ffa0c4f2ce6b190e10fb897192353668534d9156eab1d041837f48a73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cef623cd85c50744378dbace30b441b193420ccb5d3a6781ee6c58fece3102a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a7ac1aa04b16fc1058a44548ef1bfab9a79556a65afeb40798c8bdf73e44b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmrwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4n7g7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.174619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.174665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.174674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.174693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.174708 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:36Z","lastTransitionTime":"2026-02-17T14:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.187430 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"014dab04-a429-4107-89b9-53a382431c40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a40a2833a2d7bc1d9e94e0101eadfa1bc418f4143d686f41810b26566671bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff43e83143a6518fea4cb55cdc3696
49446cd2e15b3a776ac3df5f04f65a1744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1bef80b531a17e335240f260db84a0ed014c484239aa9d94d136ddf0188d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a5f85d8468815567750a13e9d5c1b63da4322dc4a257a378d2f9d6643d9bca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.202596 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe92c1f-25d7-43f8-8644-6295e9c25321\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e9be80a82dc2fb9996d26cdf005ecf8ac03a19141cac4a5928d632cfc1e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c9c57aa207fb49ee9dc0a98fc7623e19747a77c5d618ad0648c0de4edfae757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.225533 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b33a42a-1af6-4395-9331-d7aa78e0f2a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00b9dbac5f1b401bd29895d158a6375875fde57062dd9a9e3d50b662a4fd8c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96469e1b3a3e0c2a7a131f109884c7f6d1c4f72a8959c15f308ff276883687e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://849cfdcc83a8c65d74eaa79ec65157725c36f10f879a0e48aa3b9a9483886652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a24c45d759a685174f2796c6b4693de58dfdb082ccbc9b889857572694512ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a093154833c54fe11fcf8321bba7d1e5fd88b07259bc6012bce1eac5de58cfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6bbb6613117300efa6c728ba584bd5bdeccefa8fc50d1fe9864e603ac360fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6bbb6613117300efa6c728ba584bd5bdeccefa8fc50d1fe9864e603ac360fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:52:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15888f573d471aca492e61355f53b2471cb3b6f73b5329357afc1329dcee5d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15888f573d471aca492e61355f53b2471cb3b6f73b5329357afc1329dcee5d4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a59f8ffca58355293ded83f23c41b7528ce508ce893c67a9b0ad699013737cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a59f8ffca58355293ded83f23c41b7528ce508ce893c67a9b0ad699013737cf2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:52:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:52:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.237706 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4phxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64268f67-a8ad-41d1-a94a-13d926ad6022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d2f90a986f7f736822b56a64809fec44f3ee52935464fbbc772632f7878c7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5dnz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:52:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4phxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.249909 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:52:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:53:36Z is after 2025-08-24T17:21:41Z" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.277728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.277791 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.277815 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.277842 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.277863 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:36Z","lastTransitionTime":"2026-02-17T14:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.381149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.381199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.381209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.381235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.381249 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:36Z","lastTransitionTime":"2026-02-17T14:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.485325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.485411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.485432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.485476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.485494 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:36Z","lastTransitionTime":"2026-02-17T14:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.590042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.590158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.590178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.590208 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.590229 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:36Z","lastTransitionTime":"2026-02-17T14:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.693992 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.694037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.694051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.694070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.694134 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:36Z","lastTransitionTime":"2026-02-17T14:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.797701 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.797784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.797810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.797843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.797863 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:36Z","lastTransitionTime":"2026-02-17T14:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.865156 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 13:14:58.568679375 +0000 UTC Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.901303 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.901372 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.901387 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.901409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:36 crc kubenswrapper[4717]: I0217 14:53:36.901426 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:36Z","lastTransitionTime":"2026-02-17T14:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.005545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.005625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.005644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.005674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.005694 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:37Z","lastTransitionTime":"2026-02-17T14:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.109312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.109406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.109436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.109476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.109497 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:37Z","lastTransitionTime":"2026-02-17T14:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.212071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.212161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.212178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.212203 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.212221 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:37Z","lastTransitionTime":"2026-02-17T14:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.314787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.314843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.314855 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.314877 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.314893 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:37Z","lastTransitionTime":"2026-02-17T14:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.417975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.418064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.418116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.418145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.418166 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:37Z","lastTransitionTime":"2026-02-17T14:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.521527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.521579 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.521592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.521614 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.521626 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:37Z","lastTransitionTime":"2026-02-17T14:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.624645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.625069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.625251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.625569 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.626321 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:37Z","lastTransitionTime":"2026-02-17T14:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.729574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.729650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.729673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.729706 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.729768 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:37Z","lastTransitionTime":"2026-02-17T14:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.832566 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.832618 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.832630 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.832650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.832664 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:37Z","lastTransitionTime":"2026-02-17T14:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.845878 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.846021 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.845908 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:37 crc kubenswrapper[4717]: E0217 14:53:37.846235 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:37 crc kubenswrapper[4717]: E0217 14:53:37.846330 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.846373 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:37 crc kubenswrapper[4717]: E0217 14:53:37.846568 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:37 crc kubenswrapper[4717]: E0217 14:53:37.846778 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.865728 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 05:47:01.371008127 +0000 UTC Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.936543 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.936615 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.936635 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.936662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:37 crc kubenswrapper[4717]: I0217 14:53:37.936680 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:37Z","lastTransitionTime":"2026-02-17T14:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.039853 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.039992 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.040017 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.040046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.040123 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:38Z","lastTransitionTime":"2026-02-17T14:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.143987 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.144065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.144117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.144145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.144165 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:38Z","lastTransitionTime":"2026-02-17T14:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.251932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.252012 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.252037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.252117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.252146 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:38Z","lastTransitionTime":"2026-02-17T14:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.356327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.356419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.356446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.356486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.356513 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:38Z","lastTransitionTime":"2026-02-17T14:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.460131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.460164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.460178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.460195 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.460207 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:38Z","lastTransitionTime":"2026-02-17T14:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.564751 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.564830 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.564851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.564877 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.564899 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:38Z","lastTransitionTime":"2026-02-17T14:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.667175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.667251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.667291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.667526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.667574 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:38Z","lastTransitionTime":"2026-02-17T14:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.770412 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.770462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.770476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.770493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.770506 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:38Z","lastTransitionTime":"2026-02-17T14:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.866968 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 16:28:19.774014335 +0000 UTC Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.873855 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.873916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.873930 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.873952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.873998 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:38Z","lastTransitionTime":"2026-02-17T14:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.976600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.976662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.976677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.976700 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:38 crc kubenswrapper[4717]: I0217 14:53:38.976712 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:38Z","lastTransitionTime":"2026-02-17T14:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.080006 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.080066 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.080097 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.080119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.080134 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:39Z","lastTransitionTime":"2026-02-17T14:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.183638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.183720 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.183736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.183762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.183780 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:39Z","lastTransitionTime":"2026-02-17T14:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.287020 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.287075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.287115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.287139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.287153 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:39Z","lastTransitionTime":"2026-02-17T14:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.390792 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.390862 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.390881 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.390906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.390928 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:39Z","lastTransitionTime":"2026-02-17T14:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.495158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.495218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.495237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.495261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.495280 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:39Z","lastTransitionTime":"2026-02-17T14:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.599156 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.599230 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.599253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.599284 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.599307 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:39Z","lastTransitionTime":"2026-02-17T14:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.703265 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.703759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.704245 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.704293 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.704317 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:39Z","lastTransitionTime":"2026-02-17T14:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.808113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.808216 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.808233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.808259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.808277 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:39Z","lastTransitionTime":"2026-02-17T14:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.846191 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:39 crc kubenswrapper[4717]: E0217 14:53:39.846476 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.846559 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.846694 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:39 crc kubenswrapper[4717]: E0217 14:53:39.846994 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.847109 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:39 crc kubenswrapper[4717]: E0217 14:53:39.847175 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:39 crc kubenswrapper[4717]: E0217 14:53:39.847269 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.867705 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 08:44:48.56687247 +0000 UTC Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.912556 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.912632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.912654 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.912681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:39 crc kubenswrapper[4717]: I0217 14:53:39.912704 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:39Z","lastTransitionTime":"2026-02-17T14:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.015582 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.015941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.016109 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.016220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.016304 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:40Z","lastTransitionTime":"2026-02-17T14:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.148120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.148839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.148984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.149154 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.149295 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:40Z","lastTransitionTime":"2026-02-17T14:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.256479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.256562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.256580 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.256605 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.256630 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:40Z","lastTransitionTime":"2026-02-17T14:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.360565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.360646 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.360666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.360695 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.360717 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:40Z","lastTransitionTime":"2026-02-17T14:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.464454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.464515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.464529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.464551 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.464564 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:40Z","lastTransitionTime":"2026-02-17T14:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.567634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.567717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.567744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.567781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.567806 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:40Z","lastTransitionTime":"2026-02-17T14:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.671285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.671350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.671367 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.671388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.671403 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:40Z","lastTransitionTime":"2026-02-17T14:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.775150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.775225 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.775245 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.775276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.775298 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:40Z","lastTransitionTime":"2026-02-17T14:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.868564 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 15:58:12.103396419 +0000 UTC Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.878952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.879003 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.879020 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.879046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.879067 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:40Z","lastTransitionTime":"2026-02-17T14:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.982003 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.982074 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.982125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.982150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:40 crc kubenswrapper[4717]: I0217 14:53:40.982163 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:40Z","lastTransitionTime":"2026-02-17T14:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.084873 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.084941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.084960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.084989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.085008 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:41Z","lastTransitionTime":"2026-02-17T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.189181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.189246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.189266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.189291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.189314 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:41Z","lastTransitionTime":"2026-02-17T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.292361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.292413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.292426 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.292459 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.292472 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:41Z","lastTransitionTime":"2026-02-17T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.396667 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.396812 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.396835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.396869 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.396894 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:41Z","lastTransitionTime":"2026-02-17T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.500453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.500558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.500583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.500615 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.500637 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:41Z","lastTransitionTime":"2026-02-17T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.603450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.603522 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.603534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.603570 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.603584 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:41Z","lastTransitionTime":"2026-02-17T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.662209 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.662385 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:54:45.662352577 +0000 UTC m=+152.078193063 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.706393 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.706451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.706469 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.706496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.706525 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:41Z","lastTransitionTime":"2026-02-17T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.763219 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.763299 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.763347 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.763375 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763437 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763473 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763488 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763547 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.763526732 +0000 UTC m=+152.179367198 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763564 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763613 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763635 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763634 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763707 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.763686286 +0000 UTC m=+152.179526842 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763766 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.763732057 +0000 UTC m=+152.179572733 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763635 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.763815 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.763805989 +0000 UTC m=+152.179646685 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.808636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.808714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.808727 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.808754 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.808770 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:41Z","lastTransitionTime":"2026-02-17T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.846544 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.846544 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.846777 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.846734 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.846568 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.846933 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.847150 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:41 crc kubenswrapper[4717]: E0217 14:53:41.847272 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.869281 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 04:19:13.674960378 +0000 UTC Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.912450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.912510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.912521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.912542 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:41 crc kubenswrapper[4717]: I0217 14:53:41.912554 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:41Z","lastTransitionTime":"2026-02-17T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.015937 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.016001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.016017 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.016041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.016059 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:42Z","lastTransitionTime":"2026-02-17T14:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.119934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.119989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.120001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.120018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.120031 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:42Z","lastTransitionTime":"2026-02-17T14:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.223837 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.223906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.223924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.223949 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.223968 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:42Z","lastTransitionTime":"2026-02-17T14:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.327256 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.327338 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.327357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.327383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.327403 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:42Z","lastTransitionTime":"2026-02-17T14:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.430516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.430565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.430577 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.430618 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.430631 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:42Z","lastTransitionTime":"2026-02-17T14:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.533407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.533469 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.533481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.533499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.533512 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:42Z","lastTransitionTime":"2026-02-17T14:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.637983 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.638164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.638197 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.638234 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.638269 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:42Z","lastTransitionTime":"2026-02-17T14:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.742975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.743030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.743044 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.743113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.743126 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:42Z","lastTransitionTime":"2026-02-17T14:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.846267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.846427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.846444 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.846465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.846477 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:42Z","lastTransitionTime":"2026-02-17T14:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.870482 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 09:28:43.222649278 +0000 UTC Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.950438 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.950511 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.950531 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.950596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:42 crc kubenswrapper[4717]: I0217 14:53:42.950622 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:42Z","lastTransitionTime":"2026-02-17T14:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.054954 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.055027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.055048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.055074 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.055130 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:43Z","lastTransitionTime":"2026-02-17T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.158935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.159007 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.159026 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.159058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.159075 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:43Z","lastTransitionTime":"2026-02-17T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.262845 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.262915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.262940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.262978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.263007 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:43Z","lastTransitionTime":"2026-02-17T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.365963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.366033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.366054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.366109 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.366131 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:43Z","lastTransitionTime":"2026-02-17T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.469132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.469197 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.469216 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.469239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.469258 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:43Z","lastTransitionTime":"2026-02-17T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.573138 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.573218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.573245 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.573282 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.573306 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:43Z","lastTransitionTime":"2026-02-17T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.676621 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.676700 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.676726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.676760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.676784 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:43Z","lastTransitionTime":"2026-02-17T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.780899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.780987 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.781010 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.781040 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.781067 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:43Z","lastTransitionTime":"2026-02-17T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.846656 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.846776 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.846984 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.847032 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:43 crc kubenswrapper[4717]: E0217 14:53:43.847212 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:43 crc kubenswrapper[4717]: E0217 14:53:43.847446 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:43 crc kubenswrapper[4717]: E0217 14:53:43.847500 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:43 crc kubenswrapper[4717]: E0217 14:53:43.847562 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.871648 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 20:40:54.936645907 +0000 UTC Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.884217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.884248 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.884257 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.884271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.884283 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:43Z","lastTransitionTime":"2026-02-17T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.987890 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.988477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.988492 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.988513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:43 crc kubenswrapper[4717]: I0217 14:53:43.988528 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:43Z","lastTransitionTime":"2026-02-17T14:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.091676 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.091737 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.091753 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.091781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.091800 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:44Z","lastTransitionTime":"2026-02-17T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.195253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.195300 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.195310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.195327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.195337 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:44Z","lastTransitionTime":"2026-02-17T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.297785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.297846 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.297858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.297880 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.297892 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:44Z","lastTransitionTime":"2026-02-17T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.400732 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.400811 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.400839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.400871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.400893 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:44Z","lastTransitionTime":"2026-02-17T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.504158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.504212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.504229 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.504250 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.504263 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:44Z","lastTransitionTime":"2026-02-17T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.606901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.606969 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.606986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.607010 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.607023 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:44Z","lastTransitionTime":"2026-02-17T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.710450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.710513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.710530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.710557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.710575 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:44Z","lastTransitionTime":"2026-02-17T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.813788 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.813899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.813939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.813974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.813996 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:44Z","lastTransitionTime":"2026-02-17T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.872581 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 08:01:56.114005381 +0000 UTC Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.916972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.917029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.917042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.917064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:44 crc kubenswrapper[4717]: I0217 14:53:44.917106 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:44Z","lastTransitionTime":"2026-02-17T14:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.021056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.021165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.021188 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.021219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.021242 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:45Z","lastTransitionTime":"2026-02-17T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.124978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.125053 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.125069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.125112 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.125131 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:45Z","lastTransitionTime":"2026-02-17T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.227906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.227960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.227975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.227993 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.228007 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:45Z","lastTransitionTime":"2026-02-17T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.331511 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.331551 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.331565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.331584 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.331594 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:45Z","lastTransitionTime":"2026-02-17T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.435130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.435265 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.435288 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.435313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.435331 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:45Z","lastTransitionTime":"2026-02-17T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.437922 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.437955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.437967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.437980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.437989 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:53:45Z","lastTransitionTime":"2026-02-17T14:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.502256 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g"] Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.502910 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.506349 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.506462 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.506831 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.507695 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.607624 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8081465b-729a-4ac5-be0e-426d245b32f5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.607672 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8081465b-729a-4ac5-be0e-426d245b32f5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.607694 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8081465b-729a-4ac5-be0e-426d245b32f5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.607721 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8081465b-729a-4ac5-be0e-426d245b32f5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.607910 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8081465b-729a-4ac5-be0e-426d245b32f5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.624065 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4hbqf" podStartSLOduration=66.6240293 podStartE2EDuration="1m6.6240293s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:45.600875069 +0000 UTC m=+92.016715565" watchObservedRunningTime="2026-02-17 14:53:45.6240293 +0000 UTC m=+92.039869816" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.640730 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4n7g7" podStartSLOduration=66.640699672 
podStartE2EDuration="1m6.640699672s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:45.626711869 +0000 UTC m=+92.042552425" watchObservedRunningTime="2026-02-17 14:53:45.640699672 +0000 UTC m=+92.056540148" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.641097 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=33.641074922 podStartE2EDuration="33.641074922s" podCreationTimestamp="2026-02-17 14:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:45.639926392 +0000 UTC m=+92.055766908" watchObservedRunningTime="2026-02-17 14:53:45.641074922 +0000 UTC m=+92.056915398" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.659025 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.658998467 podStartE2EDuration="25.658998467s" podCreationTimestamp="2026-02-17 14:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:45.657995551 +0000 UTC m=+92.073836047" watchObservedRunningTime="2026-02-17 14:53:45.658998467 +0000 UTC m=+92.074838953" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.690113 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=12.690065393 podStartE2EDuration="12.690065393s" podCreationTimestamp="2026-02-17 14:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:45.689277123 +0000 UTC m=+92.105117649" 
watchObservedRunningTime="2026-02-17 14:53:45.690065393 +0000 UTC m=+92.105905879" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.709716 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8081465b-729a-4ac5-be0e-426d245b32f5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.709806 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8081465b-729a-4ac5-be0e-426d245b32f5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.709842 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8081465b-729a-4ac5-be0e-426d245b32f5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.709869 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8081465b-729a-4ac5-be0e-426d245b32f5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.709925 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8081465b-729a-4ac5-be0e-426d245b32f5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.709973 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8081465b-729a-4ac5-be0e-426d245b32f5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.710066 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8081465b-729a-4ac5-be0e-426d245b32f5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.710794 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8081465b-729a-4ac5-be0e-426d245b32f5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.719071 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8081465b-729a-4ac5-be0e-426d245b32f5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 
14:53:45.731409 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8081465b-729a-4ac5-be0e-426d245b32f5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2vl6g\" (UID: \"8081465b-729a-4ac5-be0e-426d245b32f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.743603 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podStartSLOduration=66.743584541 podStartE2EDuration="1m6.743584541s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:45.743267243 +0000 UTC m=+92.159107749" watchObservedRunningTime="2026-02-17 14:53:45.743584541 +0000 UTC m=+92.159425027" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.744733 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xr7pz" podStartSLOduration=67.744724841 podStartE2EDuration="1m7.744724841s" podCreationTimestamp="2026-02-17 14:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:45.727299309 +0000 UTC m=+92.143139785" watchObservedRunningTime="2026-02-17 14:53:45.744724841 +0000 UTC m=+92.160565327" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.799727 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nfvrt" podStartSLOduration=66.799699807 podStartE2EDuration="1m6.799699807s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:45.782489171 +0000 UTC 
m=+92.198329657" watchObservedRunningTime="2026-02-17 14:53:45.799699807 +0000 UTC m=+92.215540273" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.820139 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4phxm" podStartSLOduration=67.820121857 podStartE2EDuration="1m7.820121857s" podCreationTimestamp="2026-02-17 14:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:45.799642526 +0000 UTC m=+92.215483002" watchObservedRunningTime="2026-02-17 14:53:45.820121857 +0000 UTC m=+92.235962333" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.826738 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.849189 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:45 crc kubenswrapper[4717]: E0217 14:53:45.849310 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.849329 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:45 crc kubenswrapper[4717]: E0217 14:53:45.849467 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.849515 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:45 crc kubenswrapper[4717]: E0217 14:53:45.850282 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.851904 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:45 crc kubenswrapper[4717]: E0217 14:53:45.852303 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.860124 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.860081094 podStartE2EDuration="1m6.860081094s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:45.859557941 +0000 UTC m=+92.275398417" watchObservedRunningTime="2026-02-17 14:53:45.860081094 +0000 UTC m=+92.275921570" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.873152 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 21:23:43.326625215 +0000 UTC Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.874483 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.879695 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=67.879674712 podStartE2EDuration="1m7.879674712s" podCreationTimestamp="2026-02-17 14:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:45.879149118 +0000 UTC m=+92.294989604" watchObservedRunningTime="2026-02-17 14:53:45.879674712 +0000 UTC m=+92.295515188" Feb 17 14:53:45 crc kubenswrapper[4717]: I0217 14:53:45.884407 4717 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 14:53:46 crc kubenswrapper[4717]: I0217 14:53:46.424753 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" event={"ID":"8081465b-729a-4ac5-be0e-426d245b32f5","Type":"ContainerStarted","Data":"f3660f71c9337c962f589488ddcd4b2a4e27242a7bfad46754465d308075f1f0"} Feb 17 14:53:46 crc kubenswrapper[4717]: I0217 14:53:46.424820 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" event={"ID":"8081465b-729a-4ac5-be0e-426d245b32f5","Type":"ContainerStarted","Data":"e3e1e229928ef1fcf34dbbc1427bad17a9fdb3a7d0d2148ec55736514fc35525"} Feb 17 14:53:47 crc kubenswrapper[4717]: I0217 14:53:47.846487 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:47 crc kubenswrapper[4717]: I0217 14:53:47.846926 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:47 crc kubenswrapper[4717]: I0217 14:53:47.846998 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:47 crc kubenswrapper[4717]: E0217 14:53:47.847193 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:47 crc kubenswrapper[4717]: I0217 14:53:47.847496 4717 scope.go:117] "RemoveContainer" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" Feb 17 14:53:47 crc kubenswrapper[4717]: I0217 14:53:47.847553 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:47 crc kubenswrapper[4717]: E0217 14:53:47.847713 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:47 crc kubenswrapper[4717]: E0217 14:53:47.847779 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" Feb 17 14:53:47 crc kubenswrapper[4717]: E0217 14:53:47.847823 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:47 crc kubenswrapper[4717]: E0217 14:53:47.847906 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:49 crc kubenswrapper[4717]: I0217 14:53:49.846692 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:49 crc kubenswrapper[4717]: I0217 14:53:49.846853 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:49 crc kubenswrapper[4717]: E0217 14:53:49.846890 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:49 crc kubenswrapper[4717]: I0217 14:53:49.846970 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:49 crc kubenswrapper[4717]: E0217 14:53:49.847214 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:49 crc kubenswrapper[4717]: I0217 14:53:49.847542 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:49 crc kubenswrapper[4717]: E0217 14:53:49.847757 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:49 crc kubenswrapper[4717]: E0217 14:53:49.847860 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:51 crc kubenswrapper[4717]: I0217 14:53:51.846784 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:51 crc kubenswrapper[4717]: I0217 14:53:51.846804 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:51 crc kubenswrapper[4717]: I0217 14:53:51.846834 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:51 crc kubenswrapper[4717]: I0217 14:53:51.846980 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:51 crc kubenswrapper[4717]: E0217 14:53:51.847199 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:51 crc kubenswrapper[4717]: E0217 14:53:51.847325 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:51 crc kubenswrapper[4717]: E0217 14:53:51.847444 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:51 crc kubenswrapper[4717]: E0217 14:53:51.847594 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:53 crc kubenswrapper[4717]: I0217 14:53:53.846100 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:53 crc kubenswrapper[4717]: I0217 14:53:53.846183 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:53 crc kubenswrapper[4717]: I0217 14:53:53.846276 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:53 crc kubenswrapper[4717]: I0217 14:53:53.846118 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:53 crc kubenswrapper[4717]: E0217 14:53:53.846282 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:53 crc kubenswrapper[4717]: E0217 14:53:53.846465 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:53 crc kubenswrapper[4717]: E0217 14:53:53.846583 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:53 crc kubenswrapper[4717]: E0217 14:53:53.846663 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:55 crc kubenswrapper[4717]: I0217 14:53:55.846044 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:55 crc kubenswrapper[4717]: I0217 14:53:55.846236 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:55 crc kubenswrapper[4717]: I0217 14:53:55.846325 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:55 crc kubenswrapper[4717]: I0217 14:53:55.846451 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:55 crc kubenswrapper[4717]: E0217 14:53:55.848959 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:55 crc kubenswrapper[4717]: E0217 14:53:55.849124 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:55 crc kubenswrapper[4717]: E0217 14:53:55.849210 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:55 crc kubenswrapper[4717]: E0217 14:53:55.849250 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:57 crc kubenswrapper[4717]: I0217 14:53:57.846570 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:57 crc kubenswrapper[4717]: I0217 14:53:57.846611 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:57 crc kubenswrapper[4717]: I0217 14:53:57.846703 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:57 crc kubenswrapper[4717]: I0217 14:53:57.846826 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:57 crc kubenswrapper[4717]: E0217 14:53:57.848372 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:57 crc kubenswrapper[4717]: E0217 14:53:57.848322 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:57 crc kubenswrapper[4717]: E0217 14:53:57.848593 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:57 crc kubenswrapper[4717]: E0217 14:53:57.848261 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:57 crc kubenswrapper[4717]: I0217 14:53:57.849167 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:57 crc kubenswrapper[4717]: E0217 14:53:57.849315 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:53:57 crc kubenswrapper[4717]: E0217 14:53:57.849680 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs podName:f31d30c1-1e4a-49d3-adef-767a88616f33 nodeName:}" failed. 
No retries permitted until 2026-02-17 14:55:01.849644899 +0000 UTC m=+168.265485415 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs") pod "network-metrics-daemon-pzb78" (UID: "f31d30c1-1e4a-49d3-adef-767a88616f33") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:53:59 crc kubenswrapper[4717]: I0217 14:53:59.846315 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:53:59 crc kubenswrapper[4717]: I0217 14:53:59.846376 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:53:59 crc kubenswrapper[4717]: I0217 14:53:59.846385 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:53:59 crc kubenswrapper[4717]: E0217 14:53:59.846590 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:53:59 crc kubenswrapper[4717]: I0217 14:53:59.846676 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:53:59 crc kubenswrapper[4717]: E0217 14:53:59.846881 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:53:59 crc kubenswrapper[4717]: E0217 14:53:59.847032 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:53:59 crc kubenswrapper[4717]: E0217 14:53:59.848013 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:53:59 crc kubenswrapper[4717]: I0217 14:53:59.848661 4717 scope.go:117] "RemoveContainer" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" Feb 17 14:53:59 crc kubenswrapper[4717]: E0217 14:53:59.849171 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4f7wr_openshift-ovn-kubernetes(c5c8492f-64dc-4b1a-8041-d45d5ebb04f6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" Feb 17 14:54:01 crc kubenswrapper[4717]: I0217 14:54:01.846680 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:01 crc kubenswrapper[4717]: I0217 14:54:01.846765 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:01 crc kubenswrapper[4717]: I0217 14:54:01.846681 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:01 crc kubenswrapper[4717]: E0217 14:54:01.846876 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:01 crc kubenswrapper[4717]: E0217 14:54:01.846944 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:01 crc kubenswrapper[4717]: I0217 14:54:01.846992 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:01 crc kubenswrapper[4717]: E0217 14:54:01.847133 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:01 crc kubenswrapper[4717]: E0217 14:54:01.847400 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:03 crc kubenswrapper[4717]: I0217 14:54:03.846713 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:03 crc kubenswrapper[4717]: I0217 14:54:03.846868 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:03 crc kubenswrapper[4717]: I0217 14:54:03.846935 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:03 crc kubenswrapper[4717]: E0217 14:54:03.846939 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:03 crc kubenswrapper[4717]: I0217 14:54:03.846961 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:03 crc kubenswrapper[4717]: E0217 14:54:03.847217 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:03 crc kubenswrapper[4717]: E0217 14:54:03.847258 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:03 crc kubenswrapper[4717]: E0217 14:54:03.847355 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:05 crc kubenswrapper[4717]: I0217 14:54:05.846476 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:05 crc kubenswrapper[4717]: I0217 14:54:05.846507 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:05 crc kubenswrapper[4717]: I0217 14:54:05.846573 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:05 crc kubenswrapper[4717]: E0217 14:54:05.847445 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:05 crc kubenswrapper[4717]: I0217 14:54:05.847505 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:05 crc kubenswrapper[4717]: E0217 14:54:05.847725 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:05 crc kubenswrapper[4717]: E0217 14:54:05.848019 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:05 crc kubenswrapper[4717]: E0217 14:54:05.848101 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:07 crc kubenswrapper[4717]: I0217 14:54:07.846508 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:07 crc kubenswrapper[4717]: I0217 14:54:07.846550 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:07 crc kubenswrapper[4717]: E0217 14:54:07.846725 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:07 crc kubenswrapper[4717]: I0217 14:54:07.847069 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:07 crc kubenswrapper[4717]: E0217 14:54:07.847212 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:07 crc kubenswrapper[4717]: I0217 14:54:07.847400 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:07 crc kubenswrapper[4717]: E0217 14:54:07.847492 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:07 crc kubenswrapper[4717]: E0217 14:54:07.847709 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:09 crc kubenswrapper[4717]: I0217 14:54:09.846520 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:09 crc kubenswrapper[4717]: I0217 14:54:09.846652 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:09 crc kubenswrapper[4717]: E0217 14:54:09.846798 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:09 crc kubenswrapper[4717]: I0217 14:54:09.847044 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:09 crc kubenswrapper[4717]: I0217 14:54:09.847073 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:09 crc kubenswrapper[4717]: E0217 14:54:09.847163 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:09 crc kubenswrapper[4717]: E0217 14:54:09.847241 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:09 crc kubenswrapper[4717]: E0217 14:54:09.847346 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:11 crc kubenswrapper[4717]: I0217 14:54:11.846057 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:11 crc kubenswrapper[4717]: E0217 14:54:11.846902 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:11 crc kubenswrapper[4717]: I0217 14:54:11.846385 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:11 crc kubenswrapper[4717]: I0217 14:54:11.846605 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:11 crc kubenswrapper[4717]: E0217 14:54:11.847137 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:11 crc kubenswrapper[4717]: E0217 14:54:11.847325 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:11 crc kubenswrapper[4717]: I0217 14:54:11.846158 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:11 crc kubenswrapper[4717]: E0217 14:54:11.848012 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:13 crc kubenswrapper[4717]: I0217 14:54:13.846528 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:13 crc kubenswrapper[4717]: E0217 14:54:13.846709 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:13 crc kubenswrapper[4717]: I0217 14:54:13.846958 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:13 crc kubenswrapper[4717]: E0217 14:54:13.847011 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:13 crc kubenswrapper[4717]: I0217 14:54:13.847165 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:13 crc kubenswrapper[4717]: E0217 14:54:13.847220 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:13 crc kubenswrapper[4717]: I0217 14:54:13.847430 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:13 crc kubenswrapper[4717]: E0217 14:54:13.847488 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:14 crc kubenswrapper[4717]: I0217 14:54:14.532372 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nfvrt_3daa865c-6e58-4512-9be1-5d3a490a2f7a/kube-multus/1.log" Feb 17 14:54:14 crc kubenswrapper[4717]: I0217 14:54:14.533010 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nfvrt_3daa865c-6e58-4512-9be1-5d3a490a2f7a/kube-multus/0.log" Feb 17 14:54:14 crc kubenswrapper[4717]: I0217 14:54:14.533062 4717 generic.go:334] "Generic (PLEG): container finished" podID="3daa865c-6e58-4512-9be1-5d3a490a2f7a" containerID="01f9e46aae86320f6c1bcc534e3cbe373ef0df5082d7386e2e382d1c60228ca6" exitCode=1 Feb 17 14:54:14 crc kubenswrapper[4717]: I0217 14:54:14.533132 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nfvrt" event={"ID":"3daa865c-6e58-4512-9be1-5d3a490a2f7a","Type":"ContainerDied","Data":"01f9e46aae86320f6c1bcc534e3cbe373ef0df5082d7386e2e382d1c60228ca6"} Feb 17 14:54:14 crc kubenswrapper[4717]: I0217 14:54:14.533188 4717 scope.go:117] "RemoveContainer" containerID="fbfd905110c362ca6300dd21de21abbefb5fdb559b1f6faec93d07e43dfa5557" Feb 17 14:54:14 crc kubenswrapper[4717]: I0217 14:54:14.534194 4717 scope.go:117] "RemoveContainer" containerID="01f9e46aae86320f6c1bcc534e3cbe373ef0df5082d7386e2e382d1c60228ca6" Feb 17 14:54:14 crc kubenswrapper[4717]: E0217 14:54:14.535349 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-nfvrt_openshift-multus(3daa865c-6e58-4512-9be1-5d3a490a2f7a)\"" pod="openshift-multus/multus-nfvrt" podUID="3daa865c-6e58-4512-9be1-5d3a490a2f7a" Feb 17 14:54:14 crc kubenswrapper[4717]: I0217 14:54:14.561129 4717 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2vl6g" podStartSLOduration=95.561061926 podStartE2EDuration="1m35.561061926s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:53:46.448548319 +0000 UTC m=+92.864388885" watchObservedRunningTime="2026-02-17 14:54:14.561061926 +0000 UTC m=+120.976902442" Feb 17 14:54:14 crc kubenswrapper[4717]: I0217 14:54:14.847526 4717 scope.go:117] "RemoveContainer" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" Feb 17 14:54:15 crc kubenswrapper[4717]: I0217 14:54:15.538568 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/3.log" Feb 17 14:54:15 crc kubenswrapper[4717]: I0217 14:54:15.541349 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerStarted","Data":"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822"} Feb 17 14:54:15 crc kubenswrapper[4717]: I0217 14:54:15.541904 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:54:15 crc kubenswrapper[4717]: I0217 14:54:15.543019 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nfvrt_3daa865c-6e58-4512-9be1-5d3a490a2f7a/kube-multus/1.log" Feb 17 14:54:15 crc kubenswrapper[4717]: I0217 14:54:15.577941 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podStartSLOduration=96.577913595 podStartE2EDuration="1m36.577913595s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:15.577602887 +0000 UTC m=+121.993443373" watchObservedRunningTime="2026-02-17 14:54:15.577913595 +0000 UTC m=+121.993754071" Feb 17 14:54:15 crc kubenswrapper[4717]: E0217 14:54:15.814650 4717 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 17 14:54:15 crc kubenswrapper[4717]: I0217 14:54:15.845976 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:15 crc kubenswrapper[4717]: I0217 14:54:15.845979 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:15 crc kubenswrapper[4717]: I0217 14:54:15.846037 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:15 crc kubenswrapper[4717]: E0217 14:54:15.848051 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:15 crc kubenswrapper[4717]: I0217 14:54:15.848496 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:15 crc kubenswrapper[4717]: E0217 14:54:15.848615 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:15 crc kubenswrapper[4717]: E0217 14:54:15.848837 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:15 crc kubenswrapper[4717]: E0217 14:54:15.848935 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:15 crc kubenswrapper[4717]: I0217 14:54:15.869609 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pzb78"] Feb 17 14:54:15 crc kubenswrapper[4717]: E0217 14:54:15.937597 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:54:16 crc kubenswrapper[4717]: I0217 14:54:16.546142 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:16 crc kubenswrapper[4717]: E0217 14:54:16.546296 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:17 crc kubenswrapper[4717]: I0217 14:54:17.846663 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:17 crc kubenswrapper[4717]: I0217 14:54:17.846706 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:17 crc kubenswrapper[4717]: I0217 14:54:17.846714 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:17 crc kubenswrapper[4717]: I0217 14:54:17.846790 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:17 crc kubenswrapper[4717]: E0217 14:54:17.846850 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:17 crc kubenswrapper[4717]: E0217 14:54:17.847005 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:17 crc kubenswrapper[4717]: E0217 14:54:17.847122 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:17 crc kubenswrapper[4717]: E0217 14:54:17.847250 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:19 crc kubenswrapper[4717]: I0217 14:54:19.846600 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:19 crc kubenswrapper[4717]: I0217 14:54:19.846667 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:19 crc kubenswrapper[4717]: I0217 14:54:19.846631 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:19 crc kubenswrapper[4717]: I0217 14:54:19.846622 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:19 crc kubenswrapper[4717]: E0217 14:54:19.846908 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:19 crc kubenswrapper[4717]: E0217 14:54:19.847047 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:19 crc kubenswrapper[4717]: E0217 14:54:19.847192 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:19 crc kubenswrapper[4717]: E0217 14:54:19.847299 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:20 crc kubenswrapper[4717]: E0217 14:54:20.939235 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:54:21 crc kubenswrapper[4717]: I0217 14:54:21.846597 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:21 crc kubenswrapper[4717]: I0217 14:54:21.846676 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:21 crc kubenswrapper[4717]: E0217 14:54:21.846819 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:21 crc kubenswrapper[4717]: I0217 14:54:21.847207 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:21 crc kubenswrapper[4717]: I0217 14:54:21.847213 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:21 crc kubenswrapper[4717]: E0217 14:54:21.847349 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:21 crc kubenswrapper[4717]: E0217 14:54:21.847510 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:21 crc kubenswrapper[4717]: E0217 14:54:21.847628 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:23 crc kubenswrapper[4717]: I0217 14:54:23.846535 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:23 crc kubenswrapper[4717]: I0217 14:54:23.846552 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:23 crc kubenswrapper[4717]: E0217 14:54:23.846725 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:23 crc kubenswrapper[4717]: E0217 14:54:23.846920 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:23 crc kubenswrapper[4717]: I0217 14:54:23.847437 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:23 crc kubenswrapper[4717]: I0217 14:54:23.847557 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:23 crc kubenswrapper[4717]: E0217 14:54:23.847729 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:23 crc kubenswrapper[4717]: E0217 14:54:23.847838 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:25 crc kubenswrapper[4717]: I0217 14:54:25.846198 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:25 crc kubenswrapper[4717]: I0217 14:54:25.846275 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:25 crc kubenswrapper[4717]: I0217 14:54:25.846394 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:25 crc kubenswrapper[4717]: E0217 14:54:25.848955 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:25 crc kubenswrapper[4717]: I0217 14:54:25.848988 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:25 crc kubenswrapper[4717]: E0217 14:54:25.849376 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:25 crc kubenswrapper[4717]: E0217 14:54:25.849109 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:25 crc kubenswrapper[4717]: E0217 14:54:25.849022 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:25 crc kubenswrapper[4717]: E0217 14:54:25.940970 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 17 14:54:26 crc kubenswrapper[4717]: I0217 14:54:26.847396 4717 scope.go:117] "RemoveContainer" containerID="01f9e46aae86320f6c1bcc534e3cbe373ef0df5082d7386e2e382d1c60228ca6" Feb 17 14:54:27 crc kubenswrapper[4717]: I0217 14:54:27.589699 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nfvrt_3daa865c-6e58-4512-9be1-5d3a490a2f7a/kube-multus/1.log" Feb 17 14:54:27 crc kubenswrapper[4717]: I0217 14:54:27.590765 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nfvrt" event={"ID":"3daa865c-6e58-4512-9be1-5d3a490a2f7a","Type":"ContainerStarted","Data":"a852fe0b82a2d6ba3bce6e311bef1cdc6fdad339fd7922aa1007be30e0774e55"} Feb 17 14:54:27 crc kubenswrapper[4717]: I0217 14:54:27.846609 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:27 crc kubenswrapper[4717]: I0217 14:54:27.846657 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:27 crc kubenswrapper[4717]: I0217 14:54:27.846761 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:27 crc kubenswrapper[4717]: I0217 14:54:27.846789 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:27 crc kubenswrapper[4717]: E0217 14:54:27.846898 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:27 crc kubenswrapper[4717]: E0217 14:54:27.847128 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:27 crc kubenswrapper[4717]: E0217 14:54:27.847213 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:27 crc kubenswrapper[4717]: E0217 14:54:27.847392 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:29 crc kubenswrapper[4717]: I0217 14:54:29.846333 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:29 crc kubenswrapper[4717]: I0217 14:54:29.846397 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:29 crc kubenswrapper[4717]: I0217 14:54:29.846496 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:29 crc kubenswrapper[4717]: E0217 14:54:29.846569 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:54:29 crc kubenswrapper[4717]: I0217 14:54:29.846620 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:29 crc kubenswrapper[4717]: E0217 14:54:29.846771 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:54:29 crc kubenswrapper[4717]: E0217 14:54:29.846906 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:54:29 crc kubenswrapper[4717]: E0217 14:54:29.847029 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pzb78" podUID="f31d30c1-1e4a-49d3-adef-767a88616f33" Feb 17 14:54:31 crc kubenswrapper[4717]: I0217 14:54:31.846332 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:31 crc kubenswrapper[4717]: I0217 14:54:31.846424 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:31 crc kubenswrapper[4717]: I0217 14:54:31.846371 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:31 crc kubenswrapper[4717]: I0217 14:54:31.846333 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:54:31 crc kubenswrapper[4717]: I0217 14:54:31.851968 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 14:54:31 crc kubenswrapper[4717]: I0217 14:54:31.852317 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 14:54:31 crc kubenswrapper[4717]: I0217 14:54:31.852524 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 14:54:31 crc kubenswrapper[4717]: I0217 14:54:31.852551 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 14:54:31 crc kubenswrapper[4717]: I0217 14:54:31.852621 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 14:54:31 crc kubenswrapper[4717]: I0217 14:54:31.853174 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 14:54:35 crc kubenswrapper[4717]: I0217 14:54:35.237734 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.630320 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.688716 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.689513 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9t64v"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.689745 4717 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.690427 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.691584 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-549dm"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.692139 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.696268 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6lpz6"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.696708 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.696783 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.696731 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.696881 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.697005 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.697072 4717 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.697729 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.699967 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.701536 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xfb64"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.702801 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nfj6t"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.703066 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.703210 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.703468 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.703562 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.703812 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.703922 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.704048 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.704290 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.704432 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.704847 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.705243 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.705399 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.705645 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.705684 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.705737 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.705954 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 
14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.706129 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.707377 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.707912 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.709578 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.711285 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.712547 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.713489 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.714239 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.714321 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.752824 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.753009 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.753213 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dc1eca03-6063-41fd-bcab-799db27b8f23-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.753256 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-serving-cert\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.753301 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-etcd-client\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.753301 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 
14:54:36.753356 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.753424 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.753331 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc1eca03-6063-41fd-bcab-799db27b8f23-audit-dir\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.753571 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d82cfc5-b2ed-4877-b4ef-e0f55b340fef-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vd4m7\" (UID: \"8d82cfc5-b2ed-4877-b4ef-e0f55b340fef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.753747 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.754441 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc1eca03-6063-41fd-bcab-799db27b8f23-audit-policies\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.754498 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc1eca03-6063-41fd-bcab-799db27b8f23-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.754543 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-config\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.754605 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftsnz\" (UniqueName: \"kubernetes.io/projected/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-kube-api-access-ftsnz\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.754633 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35e2909f-4827-4ad7-a8f2-b464f982967f-serving-cert\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.754683 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-machine-approver-tls\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.754712 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-config\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.754772 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9ac42c-b34a-4fff-b611-176fe523d337-config\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.754920 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dc1eca03-6063-41fd-bcab-799db27b8f23-encryption-config\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.755194 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-audit-dir\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" 
Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.755225 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7rgv\" (UniqueName: \"kubernetes.io/projected/8d82cfc5-b2ed-4877-b4ef-e0f55b340fef-kube-api-access-v7rgv\") pod \"openshift-apiserver-operator-796bbdcf4f-vd4m7\" (UID: \"8d82cfc5-b2ed-4877-b4ef-e0f55b340fef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.755350 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkvpt\" (UniqueName: \"kubernetes.io/projected/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-kube-api-access-mkvpt\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.755375 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftt59\" (UniqueName: \"kubernetes.io/projected/ba04adcf-e228-4486-a74a-e9846bdaa53f-kube-api-access-ftt59\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.755504 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d82cfc5-b2ed-4877-b4ef-e0f55b340fef-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vd4m7\" (UID: \"8d82cfc5-b2ed-4877-b4ef-e0f55b340fef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.755545 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-client-ca\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.751927 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.756366 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.756546 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.757315 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.758193 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d9ac42c-b34a-4fff-b611-176fe523d337-service-ca-bundle\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.758360 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-auth-proxy-config\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 
crc kubenswrapper[4717]: I0217 14:54:36.758514 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d9ac42c-b34a-4fff-b611-176fe523d337-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.758533 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkftq\" (UniqueName: \"kubernetes.io/projected/3d9ac42c-b34a-4fff-b611-176fe523d337-kube-api-access-mkftq\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.758651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-config\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.758681 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc1eca03-6063-41fd-bcab-799db27b8f23-etcd-client\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.758697 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-node-pullsecrets\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.758843 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-encryption-config\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.758865 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc1eca03-6063-41fd-bcab-799db27b8f23-serving-cert\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.758883 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba04adcf-e228-4486-a74a-e9846bdaa53f-config\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.759062 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ba04adcf-e228-4486-a74a-e9846bdaa53f-images\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.759109 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-audit\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.759126 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-etcd-serving-ca\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.759142 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnssg\" (UniqueName: \"kubernetes.io/projected/dc1eca03-6063-41fd-bcab-799db27b8f23-kube-api-access-rnssg\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.759167 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxqdd\" (UniqueName: \"kubernetes.io/projected/35e2909f-4827-4ad7-a8f2-b464f982967f-kube-api-access-cxqdd\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.759217 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-fbqb2"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.760106 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3d9ac42c-b34a-4fff-b611-176fe523d337-serving-cert\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.766352 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-image-import-ca\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.766466 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba04adcf-e228-4486-a74a-e9846bdaa53f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.767293 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768044 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768210 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768228 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768251 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768333 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768457 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768592 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768622 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768678 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768756 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768799 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768822 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768870 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768893 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768921 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768936 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769000 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768691 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769147 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768701 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769210 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769244 4717 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769339 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769379 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769426 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769470 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769508 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769571 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769638 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769676 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769693 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.769343 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.768468 4717 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.772583 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.773410 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.774294 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-r2gmn"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.774651 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-r2gmn" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.780414 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.780820 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.780916 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.781282 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.784686 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kdfsw"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.784819 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.785075 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.785295 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-549dm"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.785380 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.785531 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.786016 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.786201 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.787804 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.787988 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.788206 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.788315 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.788412 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.788563 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.789302 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.789942 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9t64v"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.790041 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.790348 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.790530 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.790711 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.791051 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.791437 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.791531 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.791596 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.791825 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.792655 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-smmgb"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.792842 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.793283 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.793641 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.793809 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.805639 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.823003 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.823585 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.823766 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.824102 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 14:54:36 crc 
kubenswrapper[4717]: I0217 14:54:36.852232 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.854796 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.854867 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.855052 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.855990 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.856168 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.856393 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.826775 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-77x5j"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.857499 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.857877 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.862664 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.867391 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.868500 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869645 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869694 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc1eca03-6063-41fd-bcab-799db27b8f23-audit-dir\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869723 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869752 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8d82cfc5-b2ed-4877-b4ef-e0f55b340fef-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vd4m7\" (UID: \"8d82cfc5-b2ed-4877-b4ef-e0f55b340fef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869778 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869802 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc1eca03-6063-41fd-bcab-799db27b8f23-audit-policies\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869828 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc1eca03-6063-41fd-bcab-799db27b8f23-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869852 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-client-ca\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869878 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35e2909f-4827-4ad7-a8f2-b464f982967f-serving-cert\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869900 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-machine-approver-tls\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869923 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-config\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869942 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftsnz\" (UniqueName: \"kubernetes.io/projected/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-kube-api-access-ftsnz\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869968 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll6d5\" (UniqueName: \"kubernetes.io/projected/45c674d2-b259-4f0d-a92f-5a906aa07392-kube-api-access-ll6d5\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") 
" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.869992 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9ac42c-b34a-4fff-b611-176fe523d337-config\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.870013 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-config\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.870035 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dc1eca03-6063-41fd-bcab-799db27b8f23-encryption-config\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.870059 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.870304 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 14:54:36 crc 
kubenswrapper[4717]: I0217 14:54:36.870710 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.870765 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-audit-dir\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.870836 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc1eca03-6063-41fd-bcab-799db27b8f23-audit-dir\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.870696 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-audit-dir\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.870925 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7rgv\" (UniqueName: \"kubernetes.io/projected/8d82cfc5-b2ed-4877-b4ef-e0f55b340fef-kube-api-access-v7rgv\") pod \"openshift-apiserver-operator-796bbdcf4f-vd4m7\" (UID: \"8d82cfc5-b2ed-4877-b4ef-e0f55b340fef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.870963 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkvpt\" (UniqueName: 
\"kubernetes.io/projected/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-kube-api-access-mkvpt\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.870994 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45c674d2-b259-4f0d-a92f-5a906aa07392-serving-cert\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871022 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d82cfc5-b2ed-4877-b4ef-e0f55b340fef-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vd4m7\" (UID: \"8d82cfc5-b2ed-4877-b4ef-e0f55b340fef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871045 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftt59\" (UniqueName: \"kubernetes.io/projected/ba04adcf-e228-4486-a74a-e9846bdaa53f-kube-api-access-ftt59\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871065 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc 
kubenswrapper[4717]: I0217 14:54:36.871108 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2htq2\" (UniqueName: \"kubernetes.io/projected/35921a00-2137-41b1-b509-774ea0622ba3-kube-api-access-2htq2\") pod \"openshift-controller-manager-operator-756b6f6bc6-d8g2p\" (UID: \"35921a00-2137-41b1-b509-774ea0622ba3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871138 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-audit-policies\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871160 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871184 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d9ac42c-b34a-4fff-b611-176fe523d337-service-ca-bundle\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871210 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-client-ca\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871247 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-auth-proxy-config\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871274 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d9ac42c-b34a-4fff-b611-176fe523d337-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871297 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkftq\" (UniqueName: \"kubernetes.io/projected/3d9ac42c-b34a-4fff-b611-176fe523d337-kube-api-access-mkftq\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-config\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc 
kubenswrapper[4717]: I0217 14:54:36.871349 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871374 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-node-pullsecrets\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871411 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc1eca03-6063-41fd-bcab-799db27b8f23-etcd-client\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871443 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871470 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-login\") 
pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871498 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-encryption-config\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871522 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc1eca03-6063-41fd-bcab-799db27b8f23-serving-cert\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871547 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba04adcf-e228-4486-a74a-e9846bdaa53f-config\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871569 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ba04adcf-e228-4486-a74a-e9846bdaa53f-images\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871595 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871616 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-audit\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871638 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35921a00-2137-41b1-b509-774ea0622ba3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d8g2p\" (UID: \"35921a00-2137-41b1-b509-774ea0622ba3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871662 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eee77ab9-8247-4813-8137-d627a31c8840-audit-dir\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871693 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnssg\" (UniqueName: \"kubernetes.io/projected/dc1eca03-6063-41fd-bcab-799db27b8f23-kube-api-access-rnssg\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 
14:54:36.871712 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-etcd-serving-ca\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871735 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871759 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9ac42c-b34a-4fff-b611-176fe523d337-serving-cert\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871782 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-image-import-ca\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871802 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: 
\"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqdd\" (UniqueName: \"kubernetes.io/projected/35e2909f-4827-4ad7-a8f2-b464f982967f-kube-api-access-cxqdd\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871857 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d6cx\" (UniqueName: \"kubernetes.io/projected/eee77ab9-8247-4813-8137-d627a31c8840-kube-api-access-6d6cx\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871897 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba04adcf-e228-4486-a74a-e9846bdaa53f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871920 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871944 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dc1eca03-6063-41fd-bcab-799db27b8f23-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.871967 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-serving-cert\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.872002 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-etcd-client\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.872021 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35921a00-2137-41b1-b509-774ea0622ba3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d8g2p\" (UID: \"35921a00-2137-41b1-b509-774ea0622ba3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.872040 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-config\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.873285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d82cfc5-b2ed-4877-b4ef-e0f55b340fef-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vd4m7\" (UID: \"8d82cfc5-b2ed-4877-b4ef-e0f55b340fef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.873972 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d9ac42c-b34a-4fff-b611-176fe523d337-service-ca-bundle\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.874811 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-client-ca\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.875487 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-auth-proxy-config\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.876313 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 
14:54:36.876403 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d9ac42c-b34a-4fff-b611-176fe523d337-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.877032 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-config\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.877091 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-node-pullsecrets\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.879822 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-config\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.880704 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.880832 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-trusted-ca-bundle\") pod 
\"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.881451 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9ac42c-b34a-4fff-b611-176fe523d337-config\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.881505 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc1eca03-6063-41fd-bcab-799db27b8f23-audit-policies\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.881878 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc1eca03-6063-41fd-bcab-799db27b8f23-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.881976 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-etcd-serving-ca\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.884484 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-77x5j" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.886109 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.887023 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jp5jl"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.887466 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.887920 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.888264 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-image-import-ca\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.888932 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.889752 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.890190 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.891472 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-9rzh4"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.891682 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.891954 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.892818 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.892855 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.892879 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.892902 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.893141 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.894565 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.894930 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dc1eca03-6063-41fd-bcab-799db27b8f23-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.894952 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.895464 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.895537 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.895484 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.896073 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.896322 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.896655 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4kndh"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.897562 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-config\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.897851 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.898265 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba04adcf-e228-4486-a74a-e9846bdaa53f-config\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.898295 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wxgjx"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.898807 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ba04adcf-e228-4486-a74a-e9846bdaa53f-images\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.898923 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.899232 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-audit\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.899346 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.901516 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.901932 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.929534 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc1eca03-6063-41fd-bcab-799db27b8f23-etcd-client\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.934529 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-machine-approver-tls\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.935249 4717 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc1eca03-6063-41fd-bcab-799db27b8f23-serving-cert\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.936601 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35e2909f-4827-4ad7-a8f2-b464f982967f-serving-cert\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.937729 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba04adcf-e228-4486-a74a-e9846bdaa53f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.938248 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-etcd-client\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.939122 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9ac42c-b34a-4fff-b611-176fe523d337-serving-cert\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.939864 4717 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bxg9v"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.939985 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-encryption-config\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.941012 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d82cfc5-b2ed-4877-b4ef-e0f55b340fef-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vd4m7\" (UID: \"8d82cfc5-b2ed-4877-b4ef-e0f55b340fef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.942540 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.944619 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-b649c"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.945321 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.945395 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.946014 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-b649c" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.946372 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dc1eca03-6063-41fd-bcab-799db27b8f23-encryption-config\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.954187 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.955009 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.955242 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mvgrd"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.955625 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.956370 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.956440 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-serving-cert\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.956624 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.960583 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9rzh4"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.960617 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.960629 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.960640 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kdfsw"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.960649 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.960657 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.960707 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.960934 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.963513 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-smmgb"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.963553 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nfj6t"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.963564 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.965496 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.966753 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-77x5j"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.966777 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.968424 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.970580 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.971835 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-6lpz6"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.972654 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974303 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974340 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-client-ca\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974364 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6d5\" (UniqueName: \"kubernetes.io/projected/45c674d2-b259-4f0d-a92f-5a906aa07392-kube-api-access-ll6d5\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974385 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7daba40-adba-48a1-a1c1-139a096354b2-serving-cert\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:36 crc kubenswrapper[4717]: 
I0217 14:54:36.974412 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/72da7396-e722-4b51-9dbb-2f32b9f490b4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6ds8r\" (UID: \"72da7396-e722-4b51-9dbb-2f32b9f490b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974434 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/90118620-9ff2-42f8-a7d4-398df433f1e6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974455 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974472 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-oauth-serving-cert\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974490 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/90118620-9ff2-42f8-a7d4-398df433f1e6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974506 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-trusted-ca-bundle\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974526 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45c674d2-b259-4f0d-a92f-5a906aa07392-serving-cert\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974542 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-service-ca\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974566 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5tbr\" (UniqueName: \"kubernetes.io/projected/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-kube-api-access-x5tbr\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974589 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974605 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2htq2\" (UniqueName: \"kubernetes.io/projected/35921a00-2137-41b1-b509-774ea0622ba3-kube-api-access-2htq2\") pod \"openshift-controller-manager-operator-756b6f6bc6-d8g2p\" (UID: \"35921a00-2137-41b1-b509-774ea0622ba3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974620 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-audit-policies\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974634 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974661 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f7daba40-adba-48a1-a1c1-139a096354b2-etcd-client\") pod \"etcd-operator-b45778765-smmgb\" 
(UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974694 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fxnh\" (UniqueName: \"kubernetes.io/projected/f7daba40-adba-48a1-a1c1-139a096354b2-kube-api-access-7fxnh\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3de82a84-a13d-4bf9-a242-2365193a9d62-trusted-ca\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974729 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974744 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-config\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974761 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f7daba40-adba-48a1-a1c1-139a096354b2-config\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90118620-9ff2-42f8-a7d4-398df433f1e6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974803 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1416b738-4058-4e5b-be77-62d0b9172a13-serving-cert\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974817 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1416b738-4058-4e5b-be77-62d0b9172a13-trusted-ca\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 
crc kubenswrapper[4717]: I0217 14:54:36.974851 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f7daba40-adba-48a1-a1c1-139a096354b2-etcd-ca\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974869 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf7e2512-6897-4c4d-b28f-c79deecee58b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6vx9t\" (UID: \"cf7e2512-6897-4c4d-b28f-c79deecee58b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974889 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974906 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c4035af-26c4-4c7d-85f6-30fa7b167a43-metrics-tls\") pod \"dns-operator-744455d44c-77x5j\" (UID: \"9c4035af-26c4-4c7d-85f6-30fa7b167a43\") " pod="openshift-dns-operator/dns-operator-744455d44c-77x5j" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974924 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzdqj\" (UniqueName: 
\"kubernetes.io/projected/3de82a84-a13d-4bf9-a242-2365193a9d62-kube-api-access-lzdqj\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974941 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx95w\" (UniqueName: \"kubernetes.io/projected/5aed9e9a-0eff-476d-bf5c-e268f4e16e06-kube-api-access-qx95w\") pod \"downloads-7954f5f757-r2gmn\" (UID: \"5aed9e9a-0eff-476d-bf5c-e268f4e16e06\") " pod="openshift-console/downloads-7954f5f757-r2gmn" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974959 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b64270c-ff7e-4ad1-ad73-98c5e14346e0-config\") pod \"kube-controller-manager-operator-78b949d7b-plkvk\" (UID: \"9b64270c-ff7e-4ad1-ad73-98c5e14346e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974977 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z894w\" (UniqueName: \"kubernetes.io/projected/cf7e2512-6897-4c4d-b28f-c79deecee58b-kube-api-access-z894w\") pod \"cluster-samples-operator-665b6dd947-6vx9t\" (UID: \"cf7e2512-6897-4c4d-b28f-c79deecee58b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.974997 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975015 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-oauth-config\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975033 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35921a00-2137-41b1-b509-774ea0622ba3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d8g2p\" (UID: \"35921a00-2137-41b1-b509-774ea0622ba3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975050 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eee77ab9-8247-4813-8137-d627a31c8840-audit-dir\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975069 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7wsq\" (UniqueName: \"kubernetes.io/projected/1416b738-4058-4e5b-be77-62d0b9172a13-kube-api-access-l7wsq\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975101 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-serving-cert\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975124 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjl9z\" (UniqueName: \"kubernetes.io/projected/72da7396-e722-4b51-9dbb-2f32b9f490b4-kube-api-access-rjl9z\") pod \"openshift-config-operator-7777fb866f-6ds8r\" (UID: \"72da7396-e722-4b51-9dbb-2f32b9f490b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975140 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnfcc\" (UniqueName: \"kubernetes.io/projected/9c4035af-26c4-4c7d-85f6-30fa7b167a43-kube-api-access-wnfcc\") pod \"dns-operator-744455d44c-77x5j\" (UID: \"9c4035af-26c4-4c7d-85f6-30fa7b167a43\") " pod="openshift-dns-operator/dns-operator-744455d44c-77x5j" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975155 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwq7p\" (UniqueName: \"kubernetes.io/projected/61c25e66-efcf-43a0-bcc4-922e00481359-kube-api-access-rwq7p\") pod \"migrator-59844c95c7-6ld9l\" (UID: \"61c25e66-efcf-43a0-bcc4-922e00481359\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975193 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f7daba40-adba-48a1-a1c1-139a096354b2-etcd-service-ca\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975217 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975261 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d6cx\" (UniqueName: \"kubernetes.io/projected/eee77ab9-8247-4813-8137-d627a31c8840-kube-api-access-6d6cx\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975280 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b64270c-ff7e-4ad1-ad73-98c5e14346e0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-plkvk\" (UID: \"9b64270c-ff7e-4ad1-ad73-98c5e14346e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975306 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975321 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72da7396-e722-4b51-9dbb-2f32b9f490b4-serving-cert\") pod \"openshift-config-operator-7777fb866f-6ds8r\" (UID: \"72da7396-e722-4b51-9dbb-2f32b9f490b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975338 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b64270c-ff7e-4ad1-ad73-98c5e14346e0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-plkvk\" (UID: \"9b64270c-ff7e-4ad1-ad73-98c5e14346e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975363 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3de82a84-a13d-4bf9-a242-2365193a9d62-bound-sa-token\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975381 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-config\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: 
\"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975398 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35921a00-2137-41b1-b509-774ea0622ba3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d8g2p\" (UID: \"35921a00-2137-41b1-b509-774ea0622ba3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975416 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qrbp\" (UniqueName: \"kubernetes.io/projected/90118620-9ff2-42f8-a7d4-398df433f1e6-kube-api-access-4qrbp\") pod \"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975433 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975450 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1416b738-4058-4e5b-be77-62d0b9172a13-config\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.975465 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3de82a84-a13d-4bf9-a242-2365193a9d62-metrics-tls\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.976051 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.976296 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-client-ca\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.977001 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.977587 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35921a00-2137-41b1-b509-774ea0622ba3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d8g2p\" (UID: \"35921a00-2137-41b1-b509-774ea0622ba3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.978000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-audit-policies\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.978122 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.978516 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-config\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.978807 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.979000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eee77ab9-8247-4813-8137-d627a31c8840-audit-dir\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.980668 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.981130 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.983390 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.983500 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.984125 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: 
\"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.984166 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.984360 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35921a00-2137-41b1-b509-774ea0622ba3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d8g2p\" (UID: \"35921a00-2137-41b1-b509-774ea0622ba3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.984670 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45c674d2-b259-4f0d-a92f-5a906aa07392-serving-cert\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.985633 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.986250 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.987994 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.990855 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-r2gmn"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.990909 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fbqb2"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.991046 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.993282 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.993540 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.993618 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.993639 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.994548 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.995915 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.996701 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bxg9v"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.998066 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.998888 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg"] Feb 17 14:54:36 crc kubenswrapper[4717]: I0217 14:54:36.999931 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xfb64"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.001114 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.002344 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4kndh"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.003551 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-b649c"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.005231 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-djfvl"] Feb 17 
14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.005899 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-djfvl" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.006012 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wxgjx"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.007188 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mvgrd"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.008089 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.009021 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.010151 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nqkpb"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.010885 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nqkpb" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.011209 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nqkpb"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.011975 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-whb99"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.012573 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.013495 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-whb99"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.013569 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-whb99" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.032642 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.072206 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076379 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f7daba40-adba-48a1-a1c1-139a096354b2-etcd-client\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076433 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-webhook-cert\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: 
\"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076468 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96493978-ef25-4f82-ab8f-29c966a22ac6-metrics-certs\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076495 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7daba40-adba-48a1-a1c1-139a096354b2-config\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076513 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90118620-9ff2-42f8-a7d4-398df433f1e6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076534 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f7daba40-adba-48a1-a1c1-139a096354b2-etcd-ca\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076550 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-key\") pod \"service-ca-9c57cc56f-b649c\" (UID: \"b10850bc-6877-4ad3-bda3-d9307748b7e2\") " pod="openshift-service-ca/service-ca-9c57cc56f-b649c" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076567 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-plugins-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076585 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/96493978-ef25-4f82-ab8f-29c966a22ac6-default-certificate\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076601 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-socket-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076618 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qw6z\" (UniqueName: \"kubernetes.io/projected/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-kube-api-access-7qw6z\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076646 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx95w\" (UniqueName: \"kubernetes.io/projected/5aed9e9a-0eff-476d-bf5c-e268f4e16e06-kube-api-access-qx95w\") pod \"downloads-7954f5f757-r2gmn\" (UID: \"5aed9e9a-0eff-476d-bf5c-e268f4e16e06\") " pod="openshift-console/downloads-7954f5f757-r2gmn" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076662 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b64270c-ff7e-4ad1-ad73-98c5e14346e0-config\") pod \"kube-controller-manager-operator-78b949d7b-plkvk\" (UID: \"9b64270c-ff7e-4ad1-ad73-98c5e14346e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076681 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a18b1991-622b-438d-b168-91e5a21ad0f0-tmpfs\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076697 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k78mt\" (UniqueName: \"kubernetes.io/projected/057d447b-4684-4f9c-b3e3-357677289cb5-kube-api-access-k78mt\") pod \"package-server-manager-789f6589d5-rk4xx\" (UID: \"057d447b-4684-4f9c-b3e3-357677289cb5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076713 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-bxg9v\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076749 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076766 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96493978-ef25-4f82-ab8f-29c966a22ac6-service-ca-bundle\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076801 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjl9z\" (UniqueName: \"kubernetes.io/projected/72da7396-e722-4b51-9dbb-2f32b9f490b4-kube-api-access-rjl9z\") pod \"openshift-config-operator-7777fb866f-6ds8r\" (UID: \"72da7396-e722-4b51-9dbb-2f32b9f490b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076817 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnfcc\" (UniqueName: \"kubernetes.io/projected/9c4035af-26c4-4c7d-85f6-30fa7b167a43-kube-api-access-wnfcc\") pod \"dns-operator-744455d44c-77x5j\" (UID: \"9c4035af-26c4-4c7d-85f6-30fa7b167a43\") " pod="openshift-dns-operator/dns-operator-744455d44c-77x5j" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076834 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwq7p\" (UniqueName: \"kubernetes.io/projected/61c25e66-efcf-43a0-bcc4-922e00481359-kube-api-access-rwq7p\") pod \"migrator-59844c95c7-6ld9l\" (UID: \"61c25e66-efcf-43a0-bcc4-922e00481359\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076880 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b64270c-ff7e-4ad1-ad73-98c5e14346e0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-plkvk\" (UID: \"9b64270c-ff7e-4ad1-ad73-98c5e14346e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076898 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/53a28e17-d55d-4208-a23f-1254796e789f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gch26\" (UID: \"53a28e17-d55d-4208-a23f-1254796e789f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076915 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fc864b6-ff33-49d1-bed4-c1e32a47c747-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mnnjg\" (UID: \"1fc864b6-ff33-49d1-bed4-c1e32a47c747\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076938 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72da7396-e722-4b51-9dbb-2f32b9f490b4-serving-cert\") 
pod \"openshift-config-operator-7777fb866f-6ds8r\" (UID: \"72da7396-e722-4b51-9dbb-2f32b9f490b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076954 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-secret-volume\") pod \"collect-profiles-29522325-45ck6\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076970 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-apiservice-cert\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.076987 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9rzh4\" (UID: \"8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.077012 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4mtb\" (UniqueName: \"kubernetes.io/projected/efd2934c-3d71-447e-bf59-2fd6a1dd6966-kube-api-access-x4mtb\") pod \"service-ca-operator-777779d784-4kndh\" (UID: \"efd2934c-3d71-447e-bf59-2fd6a1dd6966\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 
14:54:37.077028 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/057d447b-4684-4f9c-b3e3-357677289cb5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rk4xx\" (UID: \"057d447b-4684-4f9c-b3e3-357677289cb5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.077052 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd2934c-3d71-447e-bf59-2fd6a1dd6966-config\") pod \"service-ca-operator-777779d784-4kndh\" (UID: \"efd2934c-3d71-447e-bf59-2fd6a1dd6966\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.077068 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qrbp\" (UniqueName: \"kubernetes.io/projected/90118620-9ff2-42f8-a7d4-398df433f1e6-kube-api-access-4qrbp\") pod \"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.077113 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1416b738-4058-4e5b-be77-62d0b9172a13-config\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.077139 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wg5b\" (UniqueName: 
\"kubernetes.io/projected/f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8-kube-api-access-7wg5b\") pod \"catalog-operator-68c6474976-7268l\" (UID: \"f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.077161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwmqn\" (UniqueName: \"kubernetes.io/projected/53a28e17-d55d-4208-a23f-1254796e789f-kube-api-access-kwmqn\") pod \"olm-operator-6b444d44fb-gch26\" (UID: \"53a28e17-d55d-4208-a23f-1254796e789f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.077180 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-config-volume\") pod \"collect-profiles-29522325-45ck6\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.077202 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-cabundle\") pod \"service-ca-9c57cc56f-b649c\" (UID: \"b10850bc-6877-4ad3-bda3-d9307748b7e2\") " pod="openshift-service-ca/service-ca-9c57cc56f-b649c" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.077216 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efd2934c-3d71-447e-bf59-2fd6a1dd6966-serving-cert\") pod \"service-ca-operator-777779d784-4kndh\" (UID: \"efd2934c-3d71-447e-bf59-2fd6a1dd6966\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.078782 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1416b738-4058-4e5b-be77-62d0b9172a13-config\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.078861 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/72da7396-e722-4b51-9dbb-2f32b9f490b4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6ds8r\" (UID: \"72da7396-e722-4b51-9dbb-2f32b9f490b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.078923 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-oauth-serving-cert\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.079305 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0597617d-4007-42f7-97b2-660584569977-config\") pod \"kube-apiserver-operator-766d6c64bb-6hm8n\" (UID: \"0597617d-4007-42f7-97b2-660584569977\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.079590 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f7daba40-adba-48a1-a1c1-139a096354b2-etcd-client\") pod 
\"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.079676 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f7daba40-adba-48a1-a1c1-139a096354b2-etcd-ca\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.080243 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8-profile-collector-cert\") pod \"catalog-operator-68c6474976-7268l\" (UID: \"f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.080598 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-trusted-ca-bundle\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.080877 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90118620-9ff2-42f8-a7d4-398df433f1e6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.080914 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2fft6\" (UniqueName: \"kubernetes.io/projected/1fc864b6-ff33-49d1-bed4-c1e32a47c747-kube-api-access-2fft6\") pod \"machine-config-controller-84d6567774-mnnjg\" (UID: \"1fc864b6-ff33-49d1-bed4-c1e32a47c747\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.081225 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-oauth-serving-cert\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.081555 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/72da7396-e722-4b51-9dbb-2f32b9f490b4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6ds8r\" (UID: \"72da7396-e722-4b51-9dbb-2f32b9f490b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.081842 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-service-ca\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.082030 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bxg9v\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:54:37 crc 
kubenswrapper[4717]: I0217 14:54:37.082146 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5tbr\" (UniqueName: \"kubernetes.io/projected/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-kube-api-access-x5tbr\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.082284 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0354b01-9ce9-45bf-bddf-d74b45f815cf-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hd2cz\" (UID: \"a0354b01-9ce9-45bf-bddf-d74b45f815cf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.082341 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8-srv-cert\") pod \"catalog-operator-68c6474976-7268l\" (UID: \"f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.082388 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-csi-data-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.082601 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fxnh\" (UniqueName: \"kubernetes.io/projected/f7daba40-adba-48a1-a1c1-139a096354b2-kube-api-access-7fxnh\") pod \"etcd-operator-b45778765-smmgb\" 
(UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.082770 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3de82a84-a13d-4bf9-a242-2365193a9d62-trusted-ca\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.082838 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-config\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.082904 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1416b738-4058-4e5b-be77-62d0b9172a13-trusted-ca\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.082988 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1416b738-4058-4e5b-be77-62d0b9172a13-serving-cert\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.083018 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/96493978-ef25-4f82-ab8f-29c966a22ac6-stats-auth\") pod 
\"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.083299 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-service-ca\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.083432 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0597617d-4007-42f7-97b2-660584569977-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6hm8n\" (UID: \"0597617d-4007-42f7-97b2-660584569977\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.083475 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf7e2512-6897-4c4d-b28f-c79deecee58b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6vx9t\" (UID: \"cf7e2512-6897-4c4d-b28f-c79deecee58b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.083501 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fc864b6-ff33-49d1-bed4-c1e32a47c747-proxy-tls\") pod \"machine-config-controller-84d6567774-mnnjg\" (UID: \"1fc864b6-ff33-49d1-bed4-c1e32a47c747\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.083528 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvbt8\" (UniqueName: \"kubernetes.io/projected/8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a-kube-api-access-dvbt8\") pod \"multus-admission-controller-857f4d67dd-9rzh4\" (UID: \"8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.083826 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0354b01-9ce9-45bf-bddf-d74b45f815cf-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hd2cz\" (UID: \"a0354b01-9ce9-45bf-bddf-d74b45f815cf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.083860 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c4035af-26c4-4c7d-85f6-30fa7b167a43-metrics-tls\") pod \"dns-operator-744455d44c-77x5j\" (UID: \"9c4035af-26c4-4c7d-85f6-30fa7b167a43\") " pod="openshift-dns-operator/dns-operator-744455d44c-77x5j" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.083880 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzdqj\" (UniqueName: \"kubernetes.io/projected/3de82a84-a13d-4bf9-a242-2365193a9d62-kube-api-access-lzdqj\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.084124 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z894w\" (UniqueName: \"kubernetes.io/projected/cf7e2512-6897-4c4d-b28f-c79deecee58b-kube-api-access-z894w\") pod \"cluster-samples-operator-665b6dd947-6vx9t\" (UID: 
\"cf7e2512-6897-4c4d-b28f-c79deecee58b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.084169 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0597617d-4007-42f7-97b2-660584569977-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6hm8n\" (UID: \"0597617d-4007-42f7-97b2-660584569977\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.084438 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-oauth-config\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.085029 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-config\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.085040 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1416b738-4058-4e5b-be77-62d0b9172a13-trusted-ca\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.085189 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90118620-9ff2-42f8-a7d4-398df433f1e6-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.085391 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7wsq\" (UniqueName: \"kubernetes.io/projected/1416b738-4058-4e5b-be77-62d0b9172a13-kube-api-access-l7wsq\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.087505 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3de82a84-a13d-4bf9-a242-2365193a9d62-trusted-ca\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.087561 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-serving-cert\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.087623 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7bhq\" (UniqueName: \"kubernetes.io/projected/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-kube-api-access-n7bhq\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.087664 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-mountpoint-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.087723 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f7daba40-adba-48a1-a1c1-139a096354b2-etcd-service-ca\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.087780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b45x2\" (UniqueName: \"kubernetes.io/projected/b10850bc-6877-4ad3-bda3-d9307748b7e2-kube-api-access-b45x2\") pod \"service-ca-9c57cc56f-b649c\" (UID: \"b10850bc-6877-4ad3-bda3-d9307748b7e2\") " pod="openshift-service-ca/service-ca-9c57cc56f-b649c" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.087821 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0354b01-9ce9-45bf-bddf-d74b45f815cf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hd2cz\" (UID: \"a0354b01-9ce9-45bf-bddf-d74b45f815cf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.087878 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b64270c-ff7e-4ad1-ad73-98c5e14346e0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-plkvk\" (UID: \"9b64270c-ff7e-4ad1-ad73-98c5e14346e0\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.087912 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjh5c\" (UniqueName: \"kubernetes.io/projected/a18b1991-622b-438d-b168-91e5a21ad0f0-kube-api-access-zjh5c\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.087941 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-proxy-tls\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.087978 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3de82a84-a13d-4bf9-a242-2365193a9d62-bound-sa-token\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.088028 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53a28e17-d55d-4208-a23f-1254796e789f-srv-cert\") pod \"olm-operator-6b444d44fb-gch26\" (UID: \"53a28e17-d55d-4208-a23f-1254796e789f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.088126 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-images\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.088162 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3de82a84-a13d-4bf9-a242-2365193a9d62-metrics-tls\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.088224 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkpwb\" (UniqueName: \"kubernetes.io/projected/2fbfa975-4e24-4213-8368-ba1af6b39e21-kube-api-access-bkpwb\") pod \"marketplace-operator-79b997595-bxg9v\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.088421 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7daba40-adba-48a1-a1c1-139a096354b2-serving-cert\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.088897 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-oauth-config\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.089072 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/90118620-9ff2-42f8-a7d4-398df433f1e6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.089099 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1416b738-4058-4e5b-be77-62d0b9172a13-serving-cert\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.089166 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-registration-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.089199 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qljlc\" (UniqueName: \"kubernetes.io/projected/96493978-ef25-4f82-ab8f-29c966a22ac6-kube-api-access-qljlc\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.089219 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6kg8\" (UniqueName: \"kubernetes.io/projected/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-kube-api-access-r6kg8\") pod \"collect-profiles-29522325-45ck6\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.090006 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf7e2512-6897-4c4d-b28f-c79deecee58b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6vx9t\" (UID: \"cf7e2512-6897-4c4d-b28f-c79deecee58b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.091666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-trusted-ca-bundle\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.091917 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3de82a84-a13d-4bf9-a242-2365193a9d62-metrics-tls\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.092421 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72da7396-e722-4b51-9dbb-2f32b9f490b4-serving-cert\") pod \"openshift-config-operator-7777fb866f-6ds8r\" (UID: \"72da7396-e722-4b51-9dbb-2f32b9f490b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.092919 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.093136 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/90118620-9ff2-42f8-a7d4-398df433f1e6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.094808 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-serving-cert\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.095163 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7daba40-adba-48a1-a1c1-139a096354b2-serving-cert\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.101698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7daba40-adba-48a1-a1c1-139a096354b2-config\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.112493 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.118496 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f7daba40-adba-48a1-a1c1-139a096354b2-etcd-service-ca\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.132275 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.152532 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.173163 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.181490 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b64270c-ff7e-4ad1-ad73-98c5e14346e0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-plkvk\" (UID: \"9b64270c-ff7e-4ad1-ad73-98c5e14346e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190312 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-key\") pod \"service-ca-9c57cc56f-b649c\" (UID: \"b10850bc-6877-4ad3-bda3-d9307748b7e2\") " pod="openshift-service-ca/service-ca-9c57cc56f-b649c"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190365 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-plugins-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190397 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/96493978-ef25-4f82-ab8f-29c966a22ac6-default-certificate\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190429 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qw6z\" (UniqueName: \"kubernetes.io/projected/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-kube-api-access-7qw6z\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190455 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-socket-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190491 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a18b1991-622b-438d-b168-91e5a21ad0f0-tmpfs\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190520 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k78mt\" (UniqueName: \"kubernetes.io/projected/057d447b-4684-4f9c-b3e3-357677289cb5-kube-api-access-k78mt\") pod \"package-server-manager-789f6589d5-rk4xx\" (UID: \"057d447b-4684-4f9c-b3e3-357677289cb5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190554 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96493978-ef25-4f82-ab8f-29c966a22ac6-service-ca-bundle\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190581 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bxg9v\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190603 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190705 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/53a28e17-d55d-4208-a23f-1254796e789f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gch26\" (UID: \"53a28e17-d55d-4208-a23f-1254796e789f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190743 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fc864b6-ff33-49d1-bed4-c1e32a47c747-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mnnjg\" (UID: \"1fc864b6-ff33-49d1-bed4-c1e32a47c747\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190770 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-secret-volume\") pod \"collect-profiles-29522325-45ck6\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190792 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-apiservice-cert\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190818 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9rzh4\" (UID: \"8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190842 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4mtb\" (UniqueName: \"kubernetes.io/projected/efd2934c-3d71-447e-bf59-2fd6a1dd6966-kube-api-access-x4mtb\") pod \"service-ca-operator-777779d784-4kndh\" (UID: \"efd2934c-3d71-447e-bf59-2fd6a1dd6966\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190870 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/057d447b-4684-4f9c-b3e3-357677289cb5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rk4xx\" (UID: \"057d447b-4684-4f9c-b3e3-357677289cb5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190900 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd2934c-3d71-447e-bf59-2fd6a1dd6966-config\") pod \"service-ca-operator-777779d784-4kndh\" (UID: \"efd2934c-3d71-447e-bf59-2fd6a1dd6966\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190934 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wg5b\" (UniqueName: \"kubernetes.io/projected/f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8-kube-api-access-7wg5b\") pod \"catalog-operator-68c6474976-7268l\" (UID: \"f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190955 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwmqn\" (UniqueName: \"kubernetes.io/projected/53a28e17-d55d-4208-a23f-1254796e789f-kube-api-access-kwmqn\") pod \"olm-operator-6b444d44fb-gch26\" (UID: \"53a28e17-d55d-4208-a23f-1254796e789f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.190976 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-config-volume\") pod \"collect-profiles-29522325-45ck6\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191001 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-cabundle\") pod \"service-ca-9c57cc56f-b649c\" (UID: \"b10850bc-6877-4ad3-bda3-d9307748b7e2\") " pod="openshift-service-ca/service-ca-9c57cc56f-b649c"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191020 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efd2934c-3d71-447e-bf59-2fd6a1dd6966-serving-cert\") pod \"service-ca-operator-777779d784-4kndh\" (UID: \"efd2934c-3d71-447e-bf59-2fd6a1dd6966\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191038 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8-profile-collector-cert\") pod \"catalog-operator-68c6474976-7268l\" (UID: \"f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191059 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0597617d-4007-42f7-97b2-660584569977-config\") pod \"kube-apiserver-operator-766d6c64bb-6hm8n\" (UID: \"0597617d-4007-42f7-97b2-660584569977\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191104 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fft6\" (UniqueName: \"kubernetes.io/projected/1fc864b6-ff33-49d1-bed4-c1e32a47c747-kube-api-access-2fft6\") pod \"machine-config-controller-84d6567774-mnnjg\" (UID: \"1fc864b6-ff33-49d1-bed4-c1e32a47c747\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191073 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-socket-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191129 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bxg9v\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191189 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0354b01-9ce9-45bf-bddf-d74b45f815cf-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hd2cz\" (UID: \"a0354b01-9ce9-45bf-bddf-d74b45f815cf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191218 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8-srv-cert\") pod \"catalog-operator-68c6474976-7268l\" (UID: \"f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191241 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-csi-data-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191275 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0597617d-4007-42f7-97b2-660584569977-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6hm8n\" (UID: \"0597617d-4007-42f7-97b2-660584569977\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191296 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/96493978-ef25-4f82-ab8f-29c966a22ac6-stats-auth\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191319 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fc864b6-ff33-49d1-bed4-c1e32a47c747-proxy-tls\") pod \"machine-config-controller-84d6567774-mnnjg\" (UID: \"1fc864b6-ff33-49d1-bed4-c1e32a47c747\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191341 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvbt8\" (UniqueName: \"kubernetes.io/projected/8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a-kube-api-access-dvbt8\") pod \"multus-admission-controller-857f4d67dd-9rzh4\" (UID: \"8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191364 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0354b01-9ce9-45bf-bddf-d74b45f815cf-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hd2cz\" (UID: \"a0354b01-9ce9-45bf-bddf-d74b45f815cf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191413 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0597617d-4007-42f7-97b2-660584569977-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6hm8n\" (UID: \"0597617d-4007-42f7-97b2-660584569977\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191446 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7bhq\" (UniqueName: \"kubernetes.io/projected/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-kube-api-access-n7bhq\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191470 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-mountpoint-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191499 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b45x2\" (UniqueName: \"kubernetes.io/projected/b10850bc-6877-4ad3-bda3-d9307748b7e2-kube-api-access-b45x2\") pod \"service-ca-9c57cc56f-b649c\" (UID: \"b10850bc-6877-4ad3-bda3-d9307748b7e2\") " pod="openshift-service-ca/service-ca-9c57cc56f-b649c"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191525 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0354b01-9ce9-45bf-bddf-d74b45f815cf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hd2cz\" (UID: \"a0354b01-9ce9-45bf-bddf-d74b45f815cf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191552 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjh5c\" (UniqueName: \"kubernetes.io/projected/a18b1991-622b-438d-b168-91e5a21ad0f0-kube-api-access-zjh5c\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191577 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-proxy-tls\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191619 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53a28e17-d55d-4208-a23f-1254796e789f-srv-cert\") pod \"olm-operator-6b444d44fb-gch26\" (UID: \"53a28e17-d55d-4208-a23f-1254796e789f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191639 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-images\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191664 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkpwb\" (UniqueName: \"kubernetes.io/projected/2fbfa975-4e24-4213-8368-ba1af6b39e21-kube-api-access-bkpwb\") pod \"marketplace-operator-79b997595-bxg9v\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191674 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a18b1991-622b-438d-b168-91e5a21ad0f0-tmpfs\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191770 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6kg8\" (UniqueName: \"kubernetes.io/projected/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-kube-api-access-r6kg8\") pod \"collect-profiles-29522325-45ck6\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191867 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-registration-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191921 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qljlc\" (UniqueName: \"kubernetes.io/projected/96493978-ef25-4f82-ab8f-29c966a22ac6-kube-api-access-qljlc\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191954 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-webhook-cert\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.191979 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96493978-ef25-4f82-ab8f-29c966a22ac6-metrics-certs\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.192193 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-plugins-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.192461 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-csi-data-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.192847 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-registration-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.192969 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.193225 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-mountpoint-dir\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.193326 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.194179 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fc864b6-ff33-49d1-bed4-c1e32a47c747-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mnnjg\" (UID: \"1fc864b6-ff33-49d1-bed4-c1e32a47c747\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.200478 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b64270c-ff7e-4ad1-ad73-98c5e14346e0-config\") pod \"kube-controller-manager-operator-78b949d7b-plkvk\" (UID: \"9b64270c-ff7e-4ad1-ad73-98c5e14346e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.234793 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7rgv\" (UniqueName: \"kubernetes.io/projected/8d82cfc5-b2ed-4877-b4ef-e0f55b340fef-kube-api-access-v7rgv\") pod \"openshift-apiserver-operator-796bbdcf4f-vd4m7\" (UID: \"8d82cfc5-b2ed-4877-b4ef-e0f55b340fef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.248982 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkvpt\" (UniqueName: \"kubernetes.io/projected/44d6dc49-9b2a-408c-a83d-1d5baf26bba4-kube-api-access-mkvpt\") pod \"apiserver-76f77b778f-xfb64\" (UID: \"44d6dc49-9b2a-408c-a83d-1d5baf26bba4\") " pod="openshift-apiserver/apiserver-76f77b778f-xfb64"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.271715 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftt59\" (UniqueName: \"kubernetes.io/projected/ba04adcf-e228-4486-a74a-e9846bdaa53f-kube-api-access-ftt59\") pod \"machine-api-operator-5694c8668f-9t64v\" (UID: \"ba04adcf-e228-4486-a74a-e9846bdaa53f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.287074 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkftq\" (UniqueName: \"kubernetes.io/projected/3d9ac42c-b34a-4fff-b611-176fe523d337-kube-api-access-mkftq\") pod \"authentication-operator-69f744f599-549dm\" (UID: \"3d9ac42c-b34a-4fff-b611-176fe523d337\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.298648 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xfb64"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.309221 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftsnz\" (UniqueName: \"kubernetes.io/projected/8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9-kube-api-access-ftsnz\") pod \"machine-approver-56656f9798-4jdgk\" (UID: \"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.312950 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.315462 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.333363 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.354240 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.358912 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c4035af-26c4-4c7d-85f6-30fa7b167a43-metrics-tls\") pod \"dns-operator-744455d44c-77x5j\" (UID: \"9c4035af-26c4-4c7d-85f6-30fa7b167a43\") " pod="openshift-dns-operator/dns-operator-744455d44c-77x5j"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.359365 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.372703 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.386659 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.394847 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.409745 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fc864b6-ff33-49d1-bed4-c1e32a47c747-proxy-tls\") pod \"machine-config-controller-84d6567774-mnnjg\" (UID: \"1fc864b6-ff33-49d1-bed4-c1e32a47c747\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.412680 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.433407 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.455549 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.472810 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.473923 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-images\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh"
Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.500619 4717
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.512380 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.533745 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.545680 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.553527 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.573169 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0597617d-4007-42f7-97b2-660584569977-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6hm8n\" (UID: \"0597617d-4007-42f7-97b2-660584569977\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.579015 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.580518 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xfb64"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.583930 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0597617d-4007-42f7-97b2-660584569977-config\") pod \"kube-apiserver-operator-766d6c64bb-6hm8n\" (UID: \"0597617d-4007-42f7-97b2-660584569977\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.592242 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.592624 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7"] Feb 17 14:54:37 crc kubenswrapper[4717]: W0217 14:54:37.595704 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d6dc49_9b2a_408c_a83d_1d5baf26bba4.slice/crio-05ebe18194e903834f2a74339c6756f60fd89473be3af3668c728224eeebe512 WatchSource:0}: Error finding container 05ebe18194e903834f2a74339c6756f60fd89473be3af3668c728224eeebe512: Status 404 returned error can't find the container with id 05ebe18194e903834f2a74339c6756f60fd89473be3af3668c728224eeebe512 Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.612615 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.628734 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9t64v"] Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.633005 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.635181 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-549dm"] Feb 17 14:54:37 crc kubenswrapper[4717]: 
W0217 14:54:37.637793 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba04adcf_e228_4486_a74a_e9846bdaa53f.slice/crio-e97200a7b86305aea5594c2c4d8ebe2ebade85953a0507944e7ab83a745e72d9 WatchSource:0}: Error finding container e97200a7b86305aea5594c2c4d8ebe2ebade85953a0507944e7ab83a745e72d9: Status 404 returned error can't find the container with id e97200a7b86305aea5594c2c4d8ebe2ebade85953a0507944e7ab83a745e72d9 Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.638614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-proxy-tls\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.639803 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" event={"ID":"8d82cfc5-b2ed-4877-b4ef-e0f55b340fef","Type":"ContainerStarted","Data":"668d3ab751831baa2c5a469a8185024aacc2f6d01a8aa2c9187a83ed1f89c271"} Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.643225 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" event={"ID":"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9","Type":"ContainerStarted","Data":"099d034a72f9ccd587030c2474582c1c5301770fffb49ffe621106de8c684d17"} Feb 17 14:54:37 crc kubenswrapper[4717]: W0217 14:54:37.646062 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d9ac42c_b34a_4fff_b611_176fe523d337.slice/crio-843237f1aab4e9089f9d0aedd7b5c4893db41d03315d135b31d567b6cd372bb9 WatchSource:0}: Error finding container 
843237f1aab4e9089f9d0aedd7b5c4893db41d03315d135b31d567b6cd372bb9: Status 404 returned error can't find the container with id 843237f1aab4e9089f9d0aedd7b5c4893db41d03315d135b31d567b6cd372bb9 Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.647031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xfb64" event={"ID":"44d6dc49-9b2a-408c-a83d-1d5baf26bba4","Type":"ContainerStarted","Data":"05ebe18194e903834f2a74339c6756f60fd89473be3af3668c728224eeebe512"} Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.652799 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.672193 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.692943 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.707768 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0354b01-9ce9-45bf-bddf-d74b45f815cf-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hd2cz\" (UID: \"a0354b01-9ce9-45bf-bddf-d74b45f815cf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.712482 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.732968 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 14:54:37 crc 
kubenswrapper[4717]: I0217 14:54:37.733853 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0354b01-9ce9-45bf-bddf-d74b45f815cf-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hd2cz\" (UID: \"a0354b01-9ce9-45bf-bddf-d74b45f815cf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.754130 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.772347 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.779272 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/53a28e17-d55d-4208-a23f-1254796e789f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gch26\" (UID: \"53a28e17-d55d-4208-a23f-1254796e789f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.779961 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8-profile-collector-cert\") pod \"catalog-operator-68c6474976-7268l\" (UID: \"f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.780728 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-secret-volume\") pod \"collect-profiles-29522325-45ck6\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.794241 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.812753 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.821109 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/057d447b-4684-4f9c-b3e3-357677289cb5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rk4xx\" (UID: \"057d447b-4684-4f9c-b3e3-357677289cb5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.832834 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.852339 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.872303 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.894276 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.906117 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/96493978-ef25-4f82-ab8f-29c966a22ac6-default-certificate\") pod \"router-default-5444994796-jp5jl\" (UID: 
\"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.910827 4717 request.go:700] Waited for 1.013876253s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-stats-default&limit=500&resourceVersion=0 Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.912692 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.920649 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/96493978-ef25-4f82-ab8f-29c966a22ac6-stats-auth\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.933398 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.946385 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96493978-ef25-4f82-ab8f-29c966a22ac6-metrics-certs\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.953038 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.963468 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96493978-ef25-4f82-ab8f-29c966a22ac6-service-ca-bundle\") pod 
\"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.971996 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.992974 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 14:54:37 crc kubenswrapper[4717]: I0217 14:54:37.998400 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9rzh4\" (UID: \"8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.012596 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.026061 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8-srv-cert\") pod \"catalog-operator-68c6474976-7268l\" (UID: \"f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.032461 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.051776 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 
14:54:38.071971 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.093698 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.113626 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.131818 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53a28e17-d55d-4208-a23f-1254796e789f-srv-cert\") pod \"olm-operator-6b444d44fb-gch26\" (UID: \"53a28e17-d55d-4208-a23f-1254796e789f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.156629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxqdd\" (UniqueName: \"kubernetes.io/projected/35e2909f-4827-4ad7-a8f2-b464f982967f-kube-api-access-cxqdd\") pod \"route-controller-manager-6576b87f9c-s8hhh\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.168417 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnssg\" (UniqueName: \"kubernetes.io/projected/dc1eca03-6063-41fd-bcab-799db27b8f23-kube-api-access-rnssg\") pod \"apiserver-7bbb656c7d-6h5hj\" (UID: \"dc1eca03-6063-41fd-bcab-799db27b8f23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.173207 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.192651 4717 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.192765 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-key podName:b10850bc-6877-4ad3-bda3-d9307748b7e2 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:38.692734518 +0000 UTC m=+145.108574994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-key") pod "service-ca-9c57cc56f-b649c" (UID: "b10850bc-6877-4ad3-bda3-d9307748b7e2") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.192826 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.192663 4717 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193023 4717 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193152 4717 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193034 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-operator-metrics podName:2fbfa975-4e24-4213-8368-ba1af6b39e21 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:38.692977935 +0000 UTC m=+145.108818591 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-operator-metrics") pod "marketplace-operator-79b997595-bxg9v" (UID: "2fbfa975-4e24-4213-8368-ba1af6b39e21") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193241 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-webhook-cert podName:a18b1991-622b-438d-b168-91e5a21ad0f0 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:38.693198121 +0000 UTC m=+145.109038767 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-webhook-cert") pod "packageserver-d55dfcdfc-zxwxs" (UID: "a18b1991-622b-438d-b168-91e5a21ad0f0") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193261 4717 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193291 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-cabundle podName:b10850bc-6877-4ad3-bda3-d9307748b7e2 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:38.693275963 +0000 UTC m=+145.109116639 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-cabundle") pod "service-ca-9c57cc56f-b649c" (UID: "b10850bc-6877-4ad3-bda3-d9307748b7e2") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193269 4717 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193337 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/efd2934c-3d71-447e-bf59-2fd6a1dd6966-config podName:efd2934c-3d71-447e-bf59-2fd6a1dd6966 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:38.693314474 +0000 UTC m=+145.109155180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/efd2934c-3d71-447e-bf59-2fd6a1dd6966-config") pod "service-ca-operator-777779d784-4kndh" (UID: "efd2934c-3d71-447e-bf59-2fd6a1dd6966") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193357 4717 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193389 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efd2934c-3d71-447e-bf59-2fd6a1dd6966-serving-cert podName:efd2934c-3d71-447e-bf59-2fd6a1dd6966 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:38.693358636 +0000 UTC m=+145.109199332 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/efd2934c-3d71-447e-bf59-2fd6a1dd6966-serving-cert") pod "service-ca-operator-777779d784-4kndh" (UID: "efd2934c-3d71-447e-bf59-2fd6a1dd6966") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193402 4717 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193412 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-config-volume podName:d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a nodeName:}" failed. No retries permitted until 2026-02-17 14:54:38.693403187 +0000 UTC m=+145.109243663 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-config-volume") pod "collect-profiles-29522325-45ck6" (UID: "d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193464 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-apiservice-cert podName:a18b1991-622b-438d-b168-91e5a21ad0f0 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:38.693445678 +0000 UTC m=+145.109286384 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-apiservice-cert") pod "packageserver-d55dfcdfc-zxwxs" (UID: "a18b1991-622b-438d-b168-91e5a21ad0f0") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193609 4717 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: E0217 14:54:38.193802 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-trusted-ca podName:2fbfa975-4e24-4213-8368-ba1af6b39e21 nodeName:}" failed. No retries permitted until 2026-02-17 14:54:38.693773277 +0000 UTC m=+145.109613753 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-trusted-ca") pod "marketplace-operator-79b997595-bxg9v" (UID: "2fbfa975-4e24-4213-8368-ba1af6b39e21") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.205405 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.212502 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.213768 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.232991 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.255667 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.287402 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.292571 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.312485 4717 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.332377 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.352836 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.373069 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.393012 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.414240 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 14:54:38 crc 
kubenswrapper[4717]: I0217 14:54:38.441540 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.452002 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.468121 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj"] Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.472807 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.491733 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.513096 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.533290 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.553143 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.572820 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.593312 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.612334 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 14:54:38 crc 
kubenswrapper[4717]: I0217 14:54:38.622674 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh"] Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.632528 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.653870 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" event={"ID":"8d82cfc5-b2ed-4877-b4ef-e0f55b340fef","Type":"ContainerStarted","Data":"ccab4a486d9020e141a8857a85374fdb083104d2c41ad04f7408994ca29dd51b"} Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.659279 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" event={"ID":"ba04adcf-e228-4486-a74a-e9846bdaa53f","Type":"ContainerStarted","Data":"fbab591bb00c5d0c46dffa669caf8cc81ba5bf11bb74b921f18c79bc3118f004"} Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.659684 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" event={"ID":"ba04adcf-e228-4486-a74a-e9846bdaa53f","Type":"ContainerStarted","Data":"5a4e12e4e2ee04f8157918129af83234358f91201cc5defe33e35a48e0deb6ac"} Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.659709 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" event={"ID":"ba04adcf-e228-4486-a74a-e9846bdaa53f","Type":"ContainerStarted","Data":"e97200a7b86305aea5594c2c4d8ebe2ebade85953a0507944e7ab83a745e72d9"} Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.661593 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" 
event={"ID":"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9","Type":"ContainerStarted","Data":"424208286b3f0d793fb439f8221c7f6f8b6c70d82a48a111ae2d79a8bd270141"} Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.663886 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" event={"ID":"dc1eca03-6063-41fd-bcab-799db27b8f23","Type":"ContainerStarted","Data":"d1237f66ea544808e9fa0dd5ba051fbbcf73d66ac6d81d867156801c684a0ea5"} Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.664481 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.664913 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" event={"ID":"35e2909f-4827-4ad7-a8f2-b464f982967f","Type":"ContainerStarted","Data":"938672057d00dff7db8c5111a36df1c25beba6288838ff74fb86d77cc988153a"} Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.667585 4717 generic.go:334] "Generic (PLEG): container finished" podID="44d6dc49-9b2a-408c-a83d-1d5baf26bba4" containerID="6fdc0fad1f085a0e6c75f37f980dcdb8ae0e06e43b8dfb2ac9c25be7b94046fe" exitCode=0 Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.667683 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xfb64" event={"ID":"44d6dc49-9b2a-408c-a83d-1d5baf26bba4","Type":"ContainerDied","Data":"6fdc0fad1f085a0e6c75f37f980dcdb8ae0e06e43b8dfb2ac9c25be7b94046fe"} Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.670436 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" event={"ID":"3d9ac42c-b34a-4fff-b611-176fe523d337","Type":"ContainerStarted","Data":"0fef3cc2a28a70b505322768918c9e8e54fb1ecdaaabc33e3d4c490acfd42f51"} Feb 17 14:54:38 crc 
kubenswrapper[4717]: I0217 14:54:38.670484 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" event={"ID":"3d9ac42c-b34a-4fff-b611-176fe523d337","Type":"ContainerStarted","Data":"843237f1aab4e9089f9d0aedd7b5c4893db41d03315d135b31d567b6cd372bb9"} Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.672588 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.726788 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-webhook-cert\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.726879 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-key\") pod \"service-ca-9c57cc56f-b649c\" (UID: \"b10850bc-6877-4ad3-bda3-d9307748b7e2\") " pod="openshift-service-ca/service-ca-9c57cc56f-b649c" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.726939 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bxg9v\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.727042 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-apiservice-cert\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.727123 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd2934c-3d71-447e-bf59-2fd6a1dd6966-config\") pod \"service-ca-operator-777779d784-4kndh\" (UID: \"efd2934c-3d71-447e-bf59-2fd6a1dd6966\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.727181 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-config-volume\") pod \"collect-profiles-29522325-45ck6\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.727722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-cabundle\") pod \"service-ca-9c57cc56f-b649c\" (UID: \"b10850bc-6877-4ad3-bda3-d9307748b7e2\") " pod="openshift-service-ca/service-ca-9c57cc56f-b649c" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.727766 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efd2934c-3d71-447e-bf59-2fd6a1dd6966-serving-cert\") pod \"service-ca-operator-777779d784-4kndh\" (UID: \"efd2934c-3d71-447e-bf59-2fd6a1dd6966\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.727860 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bxg9v\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.728247 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-config-volume\") pod \"collect-profiles-29522325-45ck6\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.728985 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd2934c-3d71-447e-bf59-2fd6a1dd6966-config\") pod \"service-ca-operator-777779d784-4kndh\" (UID: \"efd2934c-3d71-447e-bf59-2fd6a1dd6966\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.732095 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bxg9v\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.732788 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-cabundle\") pod \"service-ca-9c57cc56f-b649c\" (UID: \"b10850bc-6877-4ad3-bda3-d9307748b7e2\") " pod="openshift-service-ca/service-ca-9c57cc56f-b649c" Feb 17 14:54:38 crc 
kubenswrapper[4717]: I0217 14:54:38.732941 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-webhook-cert\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.732959 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b10850bc-6877-4ad3-bda3-d9307748b7e2-signing-key\") pod \"service-ca-9c57cc56f-b649c\" (UID: \"b10850bc-6877-4ad3-bda3-d9307748b7e2\") " pod="openshift-service-ca/service-ca-9c57cc56f-b649c" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.734846 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d6cx\" (UniqueName: \"kubernetes.io/projected/eee77ab9-8247-4813-8137-d627a31c8840-kube-api-access-6d6cx\") pod \"oauth-openshift-558db77b4-6lpz6\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.735767 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bxg9v\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.736489 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efd2934c-3d71-447e-bf59-2fd6a1dd6966-serving-cert\") pod \"service-ca-operator-777779d784-4kndh\" (UID: \"efd2934c-3d71-447e-bf59-2fd6a1dd6966\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.739239 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a18b1991-622b-438d-b168-91e5a21ad0f0-apiservice-cert\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.746549 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll6d5\" (UniqueName: \"kubernetes.io/projected/45c674d2-b259-4f0d-a92f-5a906aa07392-kube-api-access-ll6d5\") pod \"controller-manager-879f6c89f-nfj6t\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.750443 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2htq2\" (UniqueName: \"kubernetes.io/projected/35921a00-2137-41b1-b509-774ea0622ba3-kube-api-access-2htq2\") pod \"openshift-controller-manager-operator-756b6f6bc6-d8g2p\" (UID: \"35921a00-2137-41b1-b509-774ea0622ba3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.774171 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.774399 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.792797 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.814380 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.821819 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.833519 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.853364 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.882358 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.893192 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.912228 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.931160 4717 request.go:700] Waited for 1.91728534s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Ddns-default&limit=500&resourceVersion=0 Feb 17 14:54:38 crc kubenswrapper[4717]: 
I0217 14:54:38.933904 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.953548 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 14:54:38 crc kubenswrapper[4717]: I0217 14:54:38.955195 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.016166 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b64270c-ff7e-4ad1-ad73-98c5e14346e0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-plkvk\" (UID: \"9b64270c-ff7e-4ad1-ad73-98c5e14346e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.032346 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjl9z\" (UniqueName: \"kubernetes.io/projected/72da7396-e722-4b51-9dbb-2f32b9f490b4-kube-api-access-rjl9z\") pod \"openshift-config-operator-7777fb866f-6ds8r\" (UID: \"72da7396-e722-4b51-9dbb-2f32b9f490b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.053381 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwq7p\" (UniqueName: \"kubernetes.io/projected/61c25e66-efcf-43a0-bcc4-922e00481359-kube-api-access-rwq7p\") pod \"migrator-59844c95c7-6ld9l\" (UID: \"61c25e66-efcf-43a0-bcc4-922e00481359\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.085043 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnfcc\" (UniqueName: 
\"kubernetes.io/projected/9c4035af-26c4-4c7d-85f6-30fa7b167a43-kube-api-access-wnfcc\") pod \"dns-operator-744455d44c-77x5j\" (UID: \"9c4035af-26c4-4c7d-85f6-30fa7b167a43\") " pod="openshift-dns-operator/dns-operator-744455d44c-77x5j" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.117785 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx95w\" (UniqueName: \"kubernetes.io/projected/5aed9e9a-0eff-476d-bf5c-e268f4e16e06-kube-api-access-qx95w\") pod \"downloads-7954f5f757-r2gmn\" (UID: \"5aed9e9a-0eff-476d-bf5c-e268f4e16e06\") " pod="openshift-console/downloads-7954f5f757-r2gmn" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.144873 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.151536 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-r2gmn" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.160228 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nfj6t"] Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.161044 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5tbr\" (UniqueName: \"kubernetes.io/projected/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-kube-api-access-x5tbr\") pod \"console-f9d7485db-fbqb2\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.172832 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fxnh\" (UniqueName: \"kubernetes.io/projected/f7daba40-adba-48a1-a1c1-139a096354b2-kube-api-access-7fxnh\") pod \"etcd-operator-b45778765-smmgb\" (UID: \"f7daba40-adba-48a1-a1c1-139a096354b2\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.186781 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p"] Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.195252 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.197443 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzdqj\" (UniqueName: \"kubernetes.io/projected/3de82a84-a13d-4bf9-a242-2365193a9d62-kube-api-access-lzdqj\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.202058 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.210219 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90118620-9ff2-42f8-a7d4-398df433f1e6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.210267 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-77x5j" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.211675 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qrbp\" (UniqueName: \"kubernetes.io/projected/90118620-9ff2-42f8-a7d4-398df433f1e6-kube-api-access-4qrbp\") pod \"cluster-image-registry-operator-dc59b4c8b-65gm9\" (UID: \"90118620-9ff2-42f8-a7d4-398df433f1e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.212150 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7wsq\" (UniqueName: \"kubernetes.io/projected/1416b738-4058-4e5b-be77-62d0b9172a13-kube-api-access-l7wsq\") pod \"console-operator-58897d9998-kdfsw\" (UID: \"1416b738-4058-4e5b-be77-62d0b9172a13\") " pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.227205 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6lpz6"] Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.227449 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z894w\" (UniqueName: \"kubernetes.io/projected/cf7e2512-6897-4c4d-b28f-c79deecee58b-kube-api-access-z894w\") pod \"cluster-samples-operator-665b6dd947-6vx9t\" (UID: \"cf7e2512-6897-4c4d-b28f-c79deecee58b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t" Feb 17 14:54:39 crc kubenswrapper[4717]: W0217 14:54:39.242914 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeee77ab9_8247_4813_8137_d627a31c8840.slice/crio-f6603c8f36fde83763130015cdc80da5c9f642485428f0e328f4ddc9728e66d3 WatchSource:0}: Error finding container f6603c8f36fde83763130015cdc80da5c9f642485428f0e328f4ddc9728e66d3: 
Status 404 returned error can't find the container with id f6603c8f36fde83763130015cdc80da5c9f642485428f0e328f4ddc9728e66d3 Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.247515 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3de82a84-a13d-4bf9-a242-2365193a9d62-bound-sa-token\") pod \"ingress-operator-5b745b69d9-67r2l\" (UID: \"3de82a84-a13d-4bf9-a242-2365193a9d62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.247774 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.274003 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k78mt\" (UniqueName: \"kubernetes.io/projected/057d447b-4684-4f9c-b3e3-357677289cb5-kube-api-access-k78mt\") pod \"package-server-manager-789f6589d5-rk4xx\" (UID: \"057d447b-4684-4f9c-b3e3-357677289cb5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.294615 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qw6z\" (UniqueName: \"kubernetes.io/projected/5ab2c5e9-8f67-4615-b1a0-60fa635a87fc-kube-api-access-7qw6z\") pod \"machine-config-operator-74547568cd-dlhxh\" (UID: \"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.311538 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0597617d-4007-42f7-97b2-660584569977-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6hm8n\" (UID: \"0597617d-4007-42f7-97b2-660584569977\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.333877 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvbt8\" (UniqueName: \"kubernetes.io/projected/8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a-kube-api-access-dvbt8\") pod \"multus-admission-controller-857f4d67dd-9rzh4\" (UID: \"8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.350429 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6kg8\" (UniqueName: \"kubernetes.io/projected/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-kube-api-access-r6kg8\") pod \"collect-profiles-29522325-45ck6\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.380070 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7bhq\" (UniqueName: \"kubernetes.io/projected/8cf47e32-ca08-44e4-b0c2-07c6d53b8aec-kube-api-access-n7bhq\") pod \"csi-hostpathplugin-wxgjx\" (UID: \"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec\") " pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.396289 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wg5b\" (UniqueName: \"kubernetes.io/projected/f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8-kube-api-access-7wg5b\") pod \"catalog-operator-68c6474976-7268l\" (UID: \"f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.427703 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwmqn\" (UniqueName: 
\"kubernetes.io/projected/53a28e17-d55d-4208-a23f-1254796e789f-kube-api-access-kwmqn\") pod \"olm-operator-6b444d44fb-gch26\" (UID: \"53a28e17-d55d-4208-a23f-1254796e789f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.428897 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.436553 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkpwb\" (UniqueName: \"kubernetes.io/projected/2fbfa975-4e24-4213-8368-ba1af6b39e21-kube-api-access-bkpwb\") pod \"marketplace-operator-79b997595-bxg9v\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.438237 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.451534 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0354b01-9ce9-45bf-bddf-d74b45f815cf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hd2cz\" (UID: \"a0354b01-9ce9-45bf-bddf-d74b45f815cf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.460839 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.471283 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.479475 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.480975 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjh5c\" (UniqueName: \"kubernetes.io/projected/a18b1991-622b-438d-b168-91e5a21ad0f0-kube-api-access-zjh5c\") pod \"packageserver-d55dfcdfc-zxwxs\" (UID: \"a18b1991-622b-438d-b168-91e5a21ad0f0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.499023 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b45x2\" (UniqueName: \"kubernetes.io/projected/b10850bc-6877-4ad3-bda3-d9307748b7e2-kube-api-access-b45x2\") pod \"service-ca-9c57cc56f-b649c\" (UID: \"b10850bc-6877-4ad3-bda3-d9307748b7e2\") " pod="openshift-service-ca/service-ca-9c57cc56f-b649c" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.516630 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4mtb\" (UniqueName: \"kubernetes.io/projected/efd2934c-3d71-447e-bf59-2fd6a1dd6966-kube-api-access-x4mtb\") pod \"service-ca-operator-777779d784-4kndh\" (UID: \"efd2934c-3d71-447e-bf59-2fd6a1dd6966\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.526692 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.534129 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.539714 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.544486 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qljlc\" (UniqueName: \"kubernetes.io/projected/96493978-ef25-4f82-ab8f-29c966a22ac6-kube-api-access-qljlc\") pod \"router-default-5444994796-jp5jl\" (UID: \"96493978-ef25-4f82-ab8f-29c966a22ac6\") " pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.557056 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.563622 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fft6\" (UniqueName: \"kubernetes.io/projected/1fc864b6-ff33-49d1-bed4-c1e32a47c747-kube-api-access-2fft6\") pod \"machine-config-controller-84d6567774-mnnjg\" (UID: \"1fc864b6-ff33-49d1-bed4-c1e32a47c747\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.565475 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.573898 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.574764 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r"] Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.582376 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.597766 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.602454 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.611672 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.636276 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.643314 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.650489 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwmxv\" (UniqueName: \"kubernetes.io/projected/f576665e-17f5-4704-bd20-5debf9fb8612-kube-api-access-hwmxv\") pod \"control-plane-machine-set-operator-78cbb6b69f-vxkqh\" (UID: \"f576665e-17f5-4704-bd20-5debf9fb8612\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.650602 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f576665e-17f5-4704-bd20-5debf9fb8612-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vxkqh\" (UID: \"f576665e-17f5-4704-bd20-5debf9fb8612\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.650742 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-certificates\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.650841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lskcp\" (UniqueName: \"kubernetes.io/projected/a92032d2-9ec0-4dfd-8545-aa72231c1ca9-kube-api-access-lskcp\") pod \"kube-storage-version-migrator-operator-b67b599dd-fml67\" (UID: \"a92032d2-9ec0-4dfd-8545-aa72231c1ca9\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.650903 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-tls\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.650983 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a92032d2-9ec0-4dfd-8545-aa72231c1ca9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fml67\" (UID: \"a92032d2-9ec0-4dfd-8545-aa72231c1ca9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.651058 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-trusted-ca\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.652026 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-bound-sa-token\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.652152 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-b649c" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.652282 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.652302 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cfv5\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-kube-api-access-7cfv5\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.652357 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a92032d2-9ec0-4dfd-8545-aa72231c1ca9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fml67\" (UID: \"a92032d2-9ec0-4dfd-8545-aa72231c1ca9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.653093 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.653174 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: E0217 14:54:39.653651 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.153631773 +0000 UTC m=+146.569472249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.660721 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.745839 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-smmgb"] Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.754151 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:39 crc kubenswrapper[4717]: E0217 14:54:39.754402 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.254365524 +0000 UTC m=+146.670206000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.754567 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lskcp\" (UniqueName: \"kubernetes.io/projected/a92032d2-9ec0-4dfd-8545-aa72231c1ca9-kube-api-access-lskcp\") pod \"kube-storage-version-migrator-operator-b67b599dd-fml67\" (UID: \"a92032d2-9ec0-4dfd-8545-aa72231c1ca9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.755336 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8v2c\" (UniqueName: \"kubernetes.io/projected/e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed-kube-api-access-v8v2c\") pod \"dns-default-whb99\" (UID: \"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed\") " pod="openshift-dns/dns-default-whb99" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.755397 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97dfcf45-5b8b-4c0a-9738-6568da5ec035-cert\") pod \"ingress-canary-nqkpb\" (UID: \"97dfcf45-5b8b-4c0a-9738-6568da5ec035\") " pod="openshift-ingress-canary/ingress-canary-nqkpb" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.755485 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-tls\") pod 
\"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.755504 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwf5d\" (UniqueName: \"kubernetes.io/projected/2f7517f5-ede0-45bc-8407-d24d5e173301-kube-api-access-bwf5d\") pod \"machine-config-server-djfvl\" (UID: \"2f7517f5-ede0-45bc-8407-d24d5e173301\") " pod="openshift-machine-config-operator/machine-config-server-djfvl" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.755604 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a92032d2-9ec0-4dfd-8545-aa72231c1ca9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fml67\" (UID: \"a92032d2-9ec0-4dfd-8545-aa72231c1ca9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.755647 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2f7517f5-ede0-45bc-8407-d24d5e173301-node-bootstrap-token\") pod \"machine-config-server-djfvl\" (UID: \"2f7517f5-ede0-45bc-8407-d24d5e173301\") " pod="openshift-machine-config-operator/machine-config-server-djfvl" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.757223 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a92032d2-9ec0-4dfd-8545-aa72231c1ca9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fml67\" (UID: \"a92032d2-9ec0-4dfd-8545-aa72231c1ca9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 
14:54:39.759044 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-trusted-ca\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.759422 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed-config-volume\") pod \"dns-default-whb99\" (UID: \"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed\") " pod="openshift-dns/dns-default-whb99" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.760770 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-trusted-ca\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.760774 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-bound-sa-token\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.761143 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc 
kubenswrapper[4717]: I0217 14:54:39.761359 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cfv5\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-kube-api-access-7cfv5\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.761567 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a92032d2-9ec0-4dfd-8545-aa72231c1ca9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fml67\" (UID: \"a92032d2-9ec0-4dfd-8545-aa72231c1ca9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.762855 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.763573 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.763672 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: E0217 14:54:39.764241 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.264216373 +0000 UTC m=+146.680056849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.764859 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2f7517f5-ede0-45bc-8407-d24d5e173301-certs\") pod \"machine-config-server-djfvl\" (UID: \"2f7517f5-ede0-45bc-8407-d24d5e173301\") " pod="openshift-machine-config-operator/machine-config-server-djfvl" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.764969 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwmxv\" (UniqueName: \"kubernetes.io/projected/f576665e-17f5-4704-bd20-5debf9fb8612-kube-api-access-hwmxv\") pod \"control-plane-machine-set-operator-78cbb6b69f-vxkqh\" (UID: \"f576665e-17f5-4704-bd20-5debf9fb8612\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.765129 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed-metrics-tls\") pod \"dns-default-whb99\" (UID: \"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed\") " pod="openshift-dns/dns-default-whb99" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.765228 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f576665e-17f5-4704-bd20-5debf9fb8612-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vxkqh\" (UID: \"f576665e-17f5-4704-bd20-5debf9fb8612\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.765930 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgf8g\" (UniqueName: \"kubernetes.io/projected/97dfcf45-5b8b-4c0a-9738-6568da5ec035-kube-api-access-wgf8g\") pod \"ingress-canary-nqkpb\" (UID: \"97dfcf45-5b8b-4c0a-9738-6568da5ec035\") " pod="openshift-ingress-canary/ingress-canary-nqkpb" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.769130 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-certificates\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.769947 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-certificates\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: 
\"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.772172 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-tls\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.772348 4717 generic.go:334] "Generic (PLEG): container finished" podID="dc1eca03-6063-41fd-bcab-799db27b8f23" containerID="7ed641c2e0f92e12a1e79729f6ebcc89ab303935e3d8d256cabad1b9a2dd2056" exitCode=0 Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.772893 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" event={"ID":"dc1eca03-6063-41fd-bcab-799db27b8f23","Type":"ContainerDied","Data":"7ed641c2e0f92e12a1e79729f6ebcc89ab303935e3d8d256cabad1b9a2dd2056"} Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.775633 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a92032d2-9ec0-4dfd-8545-aa72231c1ca9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fml67\" (UID: \"a92032d2-9ec0-4dfd-8545-aa72231c1ca9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.778703 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f576665e-17f5-4704-bd20-5debf9fb8612-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vxkqh\" (UID: \"f576665e-17f5-4704-bd20-5debf9fb8612\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.786685 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.794399 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lskcp\" (UniqueName: \"kubernetes.io/projected/a92032d2-9ec0-4dfd-8545-aa72231c1ca9-kube-api-access-lskcp\") pod \"kube-storage-version-migrator-operator-b67b599dd-fml67\" (UID: \"a92032d2-9ec0-4dfd-8545-aa72231c1ca9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.797691 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" event={"ID":"35e2909f-4827-4ad7-a8f2-b464f982967f","Type":"ContainerStarted","Data":"27eeee204c2feff5387bb12477bccdf978d6ea7fcd7d38221dbfe4cff76fc4d1"} Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.798683 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.817631 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.824900 4717 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-s8hhh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.824978 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" podUID="35e2909f-4827-4ad7-a8f2-b464f982967f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.833705 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" event={"ID":"eee77ab9-8247-4813-8137-d627a31c8840","Type":"ContainerStarted","Data":"f6603c8f36fde83763130015cdc80da5c9f642485428f0e328f4ddc9728e66d3"} Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.837697 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" event={"ID":"72da7396-e722-4b51-9dbb-2f32b9f490b4","Type":"ContainerStarted","Data":"433e85c7cc21d93b51bf6895c16d1b13b7c1fe4352a13e156ae0e029323ba8de"} Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.867369 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-bound-sa-token\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:39 
crc kubenswrapper[4717]: I0217 14:54:39.875937 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:54:39 crc kubenswrapper[4717]: E0217 14:54:39.876321 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.376295204 +0000 UTC m=+146.792135680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.876376 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2f7517f5-ede0-45bc-8407-d24d5e173301-node-bootstrap-token\") pod \"machine-config-server-djfvl\" (UID: \"2f7517f5-ede0-45bc-8407-d24d5e173301\") " pod="openshift-machine-config-operator/machine-config-server-djfvl"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.876416 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed-config-volume\") pod \"dns-default-whb99\" (UID: \"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed\") " pod="openshift-dns/dns-default-whb99"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.876543 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.876616 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2f7517f5-ede0-45bc-8407-d24d5e173301-certs\") pod \"machine-config-server-djfvl\" (UID: \"2f7517f5-ede0-45bc-8407-d24d5e173301\") " pod="openshift-machine-config-operator/machine-config-server-djfvl"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.876670 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed-metrics-tls\") pod \"dns-default-whb99\" (UID: \"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed\") " pod="openshift-dns/dns-default-whb99"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.876723 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgf8g\" (UniqueName: \"kubernetes.io/projected/97dfcf45-5b8b-4c0a-9738-6568da5ec035-kube-api-access-wgf8g\") pod \"ingress-canary-nqkpb\" (UID: \"97dfcf45-5b8b-4c0a-9738-6568da5ec035\") " pod="openshift-ingress-canary/ingress-canary-nqkpb"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.876803 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8v2c\" (UniqueName: \"kubernetes.io/projected/e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed-kube-api-access-v8v2c\") pod \"dns-default-whb99\" (UID: \"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed\") " pod="openshift-dns/dns-default-whb99"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.876833 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97dfcf45-5b8b-4c0a-9738-6568da5ec035-cert\") pod \"ingress-canary-nqkpb\" (UID: \"97dfcf45-5b8b-4c0a-9738-6568da5ec035\") " pod="openshift-ingress-canary/ingress-canary-nqkpb"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.876860 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwf5d\" (UniqueName: \"kubernetes.io/projected/2f7517f5-ede0-45bc-8407-d24d5e173301-kube-api-access-bwf5d\") pod \"machine-config-server-djfvl\" (UID: \"2f7517f5-ede0-45bc-8407-d24d5e173301\") " pod="openshift-machine-config-operator/machine-config-server-djfvl"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.877820 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed-config-volume\") pod \"dns-default-whb99\" (UID: \"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed\") " pod="openshift-dns/dns-default-whb99"
Feb 17 14:54:39 crc kubenswrapper[4717]: E0217 14:54:39.878481 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.378466036 +0000 UTC m=+146.794306512 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.881532 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97dfcf45-5b8b-4c0a-9738-6568da5ec035-cert\") pod \"ingress-canary-nqkpb\" (UID: \"97dfcf45-5b8b-4c0a-9738-6568da5ec035\") " pod="openshift-ingress-canary/ingress-canary-nqkpb"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.883522 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cfv5\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-kube-api-access-7cfv5\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.887341 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwmxv\" (UniqueName: \"kubernetes.io/projected/f576665e-17f5-4704-bd20-5debf9fb8612-kube-api-access-hwmxv\") pod \"control-plane-machine-set-operator-78cbb6b69f-vxkqh\" (UID: \"f576665e-17f5-4704-bd20-5debf9fb8612\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.887785 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.888370 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2f7517f5-ede0-45bc-8407-d24d5e173301-certs\") pod \"machine-config-server-djfvl\" (UID: \"2f7517f5-ede0-45bc-8407-d24d5e173301\") " pod="openshift-machine-config-operator/machine-config-server-djfvl"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.890354 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2f7517f5-ede0-45bc-8407-d24d5e173301-node-bootstrap-token\") pod \"machine-config-server-djfvl\" (UID: \"2f7517f5-ede0-45bc-8407-d24d5e173301\") " pod="openshift-machine-config-operator/machine-config-server-djfvl"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.896620 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed-metrics-tls\") pod \"dns-default-whb99\" (UID: \"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed\") " pod="openshift-dns/dns-default-whb99"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.933035 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwf5d\" (UniqueName: \"kubernetes.io/projected/2f7517f5-ede0-45bc-8407-d24d5e173301-kube-api-access-bwf5d\") pod \"machine-config-server-djfvl\" (UID: \"2f7517f5-ede0-45bc-8407-d24d5e173301\") " pod="openshift-machine-config-operator/machine-config-server-djfvl"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.947713 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgf8g\" (UniqueName: \"kubernetes.io/projected/97dfcf45-5b8b-4c0a-9738-6568da5ec035-kube-api-access-wgf8g\") pod \"ingress-canary-nqkpb\" (UID: \"97dfcf45-5b8b-4c0a-9738-6568da5ec035\") " pod="openshift-ingress-canary/ingress-canary-nqkpb"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.970581 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8v2c\" (UniqueName: \"kubernetes.io/projected/e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed-kube-api-access-v8v2c\") pod \"dns-default-whb99\" (UID: \"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed\") " pod="openshift-dns/dns-default-whb99"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.976118 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-djfvl"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.977359 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:54:39 crc kubenswrapper[4717]: E0217 14:54:39.977667 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.477634572 +0000 UTC m=+146.893475048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.978103 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd"
Feb 17 14:54:39 crc kubenswrapper[4717]: E0217 14:54:39.978635 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.47862272 +0000 UTC m=+146.894463196 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.982176 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nqkpb"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.984725 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh"
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.987846 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" event={"ID":"45c674d2-b259-4f0d-a92f-5a906aa07392","Type":"ContainerStarted","Data":"6da73034eafcd6e21c5a32224b35d2b47c1187f840ebb4ac7596220c38865ee1"}
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.987906 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l"]
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.987930 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" event={"ID":"35921a00-2137-41b1-b509-774ea0622ba3","Type":"ContainerStarted","Data":"6da78179eff9d69901064c872b6582c4b82ffbc70137d632a422aace47e66849"}
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.987947 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-r2gmn"]
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.987964 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" event={"ID":"8ce8e198-bfa7-4c6d-a15c-2b3732fa16f9","Type":"ContainerStarted","Data":"aa7721ad4d52bc2e45f4ceba374627793e2878b54369c1080c49b2a8f6970eaa"}
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.988008 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-77x5j"]
Feb 17 14:54:39 crc kubenswrapper[4717]: I0217 14:54:39.989416 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-whb99"
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.079838 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.080246 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.580219156 +0000 UTC m=+146.996059632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.080935 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd"
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.083930 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.58391489 +0000 UTC m=+146.999755366 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.116360 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk"]
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.187672 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.190801 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.690777875 +0000 UTC m=+147.106618351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.290368 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd"
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.290735 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.790720933 +0000 UTC m=+147.206561399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.395912 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.396363 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:40.896344022 +0000 UTC m=+147.312184508 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.502999 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd"
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.507122 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.007100527 +0000 UTC m=+147.422941003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.604318 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.608075 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.108041234 +0000 UTC m=+147.523881710 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.706352 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd"
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.706744 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.206728137 +0000 UTC m=+147.622568613 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.807900 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.808215 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.308189708 +0000 UTC m=+147.724030184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.808728 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd"
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.809262 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.309234228 +0000 UTC m=+147.725074704 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.927603 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.929453 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.429410449 +0000 UTC m=+147.845250925 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.943466 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd"
Feb 17 14:54:40 crc kubenswrapper[4717]: E0217 14:54:40.944248 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.444227818 +0000 UTC m=+147.860068304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.951867 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" event={"ID":"eee77ab9-8247-4813-8137-d627a31c8840","Type":"ContainerStarted","Data":"e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0"}
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.956136 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6"
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.970869 4717 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6lpz6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body=
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.971271 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" podUID="eee77ab9-8247-4813-8137-d627a31c8840" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused"
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.984974 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n"]
Feb 17 14:54:40 crc kubenswrapper[4717]: I0217 14:54:40.993156 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-r2gmn" event={"ID":"5aed9e9a-0eff-476d-bf5c-e268f4e16e06","Type":"ContainerStarted","Data":"6e4de107962d4cf821b889cfc10d13f1b0f56d25b8fa75832dca19c6e752d940"}
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:40.998777 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kdfsw"]
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:40.998891 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4jdgk" podStartSLOduration=122.998869784 podStartE2EDuration="2m2.998869784s" podCreationTimestamp="2026-02-17 14:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:40.969269587 +0000 UTC m=+147.385110083" watchObservedRunningTime="2026-02-17 14:54:40.998869784 +0000 UTC m=+147.414710260"
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.003926 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz"]
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.026568 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" event={"ID":"45c674d2-b259-4f0d-a92f-5a906aa07392","Type":"ContainerStarted","Data":"e0991f3cae46b3aab025eb65a4752385a7cef4b57ce19e0b8f93d06e86edde98"}
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.028215 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t"
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.044769 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.045540 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.545512664 +0000 UTC m=+147.961353140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.051487 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-djfvl" event={"ID":"2f7517f5-ede0-45bc-8407-d24d5e173301","Type":"ContainerStarted","Data":"a925a50364a35debe625469910d6bed0d90105c1118a827999566659fe494596"}
Feb 17 14:54:41 crc kubenswrapper[4717]: W0217 14:54:41.052546 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0597617d_4007_42f7_97b2_660584569977.slice/crio-372f1d0041e79d253be5086f8faa3188c429606ae154d6515720f905ab75c2b4 WatchSource:0}: Error finding container 372f1d0041e79d253be5086f8faa3188c429606ae154d6515720f905ab75c2b4: Status 404 returned error can't find the container with id 372f1d0041e79d253be5086f8faa3188c429606ae154d6515720f905ab75c2b4
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.052674 4717 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nfj6t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.052772 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" podUID="45c674d2-b259-4f0d-a92f-5a906aa07392" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.059553 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t"]
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.069211 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fbqb2"]
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.076471 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9"]
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.080429 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-77x5j" event={"ID":"9c4035af-26c4-4c7d-85f6-30fa7b167a43","Type":"ContainerStarted","Data":"0fefcf2ed2f1bdb06ba573a4c02b0a336da40cc5fa09d03b2b9fc084931b5f68"}
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.123284 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l"]
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.130998 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx"]
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.132027 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jp5jl" event={"ID":"96493978-ef25-4f82-ab8f-29c966a22ac6","Type":"ContainerStarted","Data":"1a2b9e71d2f0a52bc60b563788fdb61d8805923791137bd92f729675cd30c6bd"}
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.132103 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jp5jl" event={"ID":"96493978-ef25-4f82-ab8f-29c966a22ac6","Type":"ContainerStarted","Data":"a3672a8a3db28aa6899c0372823c58f4e442acda1c17a700b62825bf7ae4b677"}
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.133107 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh"]
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.145952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" event={"ID":"35921a00-2137-41b1-b509-774ea0622ba3","Type":"ContainerStarted","Data":"02402de6d10689eb422a4160212d1fa1c8b452bcfad1b3e632748c0b22093160"}
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.148798 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd"
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.150142 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4kndh"]
Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.150232 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.650209657 +0000 UTC m=+148.066050133 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.229904 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" event={"ID":"f7daba40-adba-48a1-a1c1-139a096354b2","Type":"ContainerStarted","Data":"4f328b477d0aac64cd605db89a65ed7c4cffc346178ff1d55b9ebfbe557acf4d"}
Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.250730 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.254802 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.754781487 +0000 UTC m=+148.170621963 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.269351 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vd4m7" podStartSLOduration=123.269327469 podStartE2EDuration="2m3.269327469s" podCreationTimestamp="2026-02-17 14:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:41.231245641 +0000 UTC m=+147.647086137" watchObservedRunningTime="2026-02-17 14:54:41.269327469 +0000 UTC m=+147.685167945" Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.290021 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xfb64" event={"ID":"44d6dc49-9b2a-408c-a83d-1d5baf26bba4","Type":"ContainerStarted","Data":"2c25ae2616c5c777a175fedb57957b442245c3b308d05ded7642b334221865f8"} Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.299767 4717 generic.go:334] "Generic (PLEG): container finished" podID="72da7396-e722-4b51-9dbb-2f32b9f490b4" containerID="0167266eb899c4ac8263231a1a659883276d287fff1882a773446bf0ab103598" exitCode=0 Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.300221 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" event={"ID":"72da7396-e722-4b51-9dbb-2f32b9f490b4","Type":"ContainerDied","Data":"0167266eb899c4ac8263231a1a659883276d287fff1882a773446bf0ab103598"} Feb 17 14:54:41 crc 
kubenswrapper[4717]: I0217 14:54:41.313591 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l" event={"ID":"61c25e66-efcf-43a0-bcc4-922e00481359","Type":"ContainerStarted","Data":"8459a2562aa7e4c910d7b84ee09c3ba29ef2e25d29430862846658f2c36fd059"} Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.313646 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l" event={"ID":"61c25e66-efcf-43a0-bcc4-922e00481359","Type":"ContainerStarted","Data":"fb08bfe5d11bea637d29c531d1e5682eee0190ee9e21cd734bfb30ab576c74c8"} Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.320531 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" event={"ID":"9b64270c-ff7e-4ad1-ad73-98c5e14346e0","Type":"ContainerStarted","Data":"0aa1a17263d622777dfbedaf69f22889528b764bed1d6528187ff053329aecee"} Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.335971 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.336618 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-549dm" podStartSLOduration=123.336583752 podStartE2EDuration="2m3.336583752s" podCreationTimestamp="2026-02-17 14:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:41.317344737 +0000 UTC m=+147.733185233" watchObservedRunningTime="2026-02-17 14:54:41.336583752 +0000 UTC m=+147.752424228" Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.357670 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.357736 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-b649c"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.358157 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.359408 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.859379257 +0000 UTC m=+148.275219733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.367646 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26"] Feb 17 14:54:41 crc kubenswrapper[4717]: W0217 14:54:41.415112 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda18b1991_622b_438d_b168_91e5a21ad0f0.slice/crio-9027ca665f9afc817598ab189d03f50cd2b10cb66d42a8011daaef8cc9c9384a WatchSource:0}: Error finding container 9027ca665f9afc817598ab189d03f50cd2b10cb66d42a8011daaef8cc9c9384a: Status 404 returned error can't find the container with id 9027ca665f9afc817598ab189d03f50cd2b10cb66d42a8011daaef8cc9c9384a Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.459701 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.460468 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.960434257 +0000 UTC m=+148.376274733 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.460617 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bxg9v"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.461162 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.462074 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:41.962056903 +0000 UTC m=+148.377897389 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.482422 4717 csr.go:261] certificate signing request csr-n24gz is approved, waiting to be issued Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.485824 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.489193 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wxgjx"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.491377 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-9t64v" podStartSLOduration=122.491366102 podStartE2EDuration="2m2.491366102s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:41.483293694 +0000 UTC m=+147.899134170" watchObservedRunningTime="2026-02-17 14:54:41.491366102 +0000 UTC m=+147.907206578" Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.494312 4717 csr.go:257] certificate signing request csr-n24gz is issued Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.524377 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" podStartSLOduration=122.524354426 podStartE2EDuration="2m2.524354426s" 
podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:41.512901022 +0000 UTC m=+147.928741498" watchObservedRunningTime="2026-02-17 14:54:41.524354426 +0000 UTC m=+147.940194902" Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.524714 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9rzh4"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.530142 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.552833 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nqkpb"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.562736 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.563531 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.063507094 +0000 UTC m=+148.479347570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.566408 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.568712 4717 patch_prober.go:28] interesting pod/router-default-5444994796-jp5jl container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.569132 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jp5jl" podUID="96493978-ef25-4f82-ab8f-29c966a22ac6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 17 14:54:41 crc kubenswrapper[4717]: W0217 14:54:41.583855 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8a3fc6e_7e2e_4cc4_bbfe_c16e1299995a.slice/crio-b79a7613c3f0f14618c91ed84b16302e920a0301613f310c13c15ea057f45dcc WatchSource:0}: Error finding container b79a7613c3f0f14618c91ed84b16302e920a0301613f310c13c15ea057f45dcc: Status 404 returned error can't find the container with id b79a7613c3f0f14618c91ed84b16302e920a0301613f310c13c15ea057f45dcc Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.596009 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.596069 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.604109 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-whb99"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.606809 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67"] Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.669096 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.669669 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.169640267 +0000 UTC m=+148.585480883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.681999 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" podStartSLOduration=123.681975936 podStartE2EDuration="2m3.681975936s" podCreationTimestamp="2026-02-17 14:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:41.679892667 +0000 UTC m=+148.095733143" watchObservedRunningTime="2026-02-17 14:54:41.681975936 +0000 UTC m=+148.097816412" Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.744034 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jp5jl" podStartSLOduration=122.744007421 podStartE2EDuration="2m2.744007421s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:41.710575865 +0000 UTC m=+148.126416341" watchObservedRunningTime="2026-02-17 14:54:41.744007421 +0000 UTC m=+148.159847897" Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.744430 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d8g2p" podStartSLOduration=122.744423933 podStartE2EDuration="2m2.744423933s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:41.741754158 +0000 UTC m=+148.157594634" watchObservedRunningTime="2026-02-17 14:54:41.744423933 +0000 UTC m=+148.160264419" Feb 17 14:54:41 crc kubenswrapper[4717]: W0217 14:54:41.757753 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda92032d2_9ec0_4dfd_8545_aa72231c1ca9.slice/crio-507785876b08478aa0ac90d740b1035ebc839e822c0ca6fb63704da234cf36a4 WatchSource:0}: Error finding container 507785876b08478aa0ac90d740b1035ebc839e822c0ca6fb63704da234cf36a4: Status 404 returned error can't find the container with id 507785876b08478aa0ac90d740b1035ebc839e822c0ca6fb63704da234cf36a4 Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.771289 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.771501 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.271471779 +0000 UTC m=+148.687312255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.771724 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.772400 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.272382905 +0000 UTC m=+148.688223371 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.809900 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" podStartSLOduration=122.809881406 podStartE2EDuration="2m2.809881406s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:41.808538048 +0000 UTC m=+148.224378524" watchObservedRunningTime="2026-02-17 14:54:41.809881406 +0000 UTC m=+148.225721882" Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.858762 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" podStartSLOduration=122.858739789 podStartE2EDuration="2m2.858739789s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:41.855449985 +0000 UTC m=+148.271290461" watchObservedRunningTime="2026-02-17 14:54:41.858739789 +0000 UTC m=+148.274580265" Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.874831 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.875037 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.375005359 +0000 UTC m=+148.790845825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.875165 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.875529 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.375516763 +0000 UTC m=+148.791357239 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.886733 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-djfvl" podStartSLOduration=5.88671299 podStartE2EDuration="5.88671299s" podCreationTimestamp="2026-02-17 14:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:41.885576958 +0000 UTC m=+148.301417434" watchObservedRunningTime="2026-02-17 14:54:41.88671299 +0000 UTC m=+148.302553466" Feb 17 14:54:41 crc kubenswrapper[4717]: I0217 14:54:41.976768 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:41 crc kubenswrapper[4717]: E0217 14:54:41.977285 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.477255833 +0000 UTC m=+148.893096309 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.081033 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:42 crc kubenswrapper[4717]: E0217 14:54:42.081799 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.58177076 +0000 UTC m=+148.997611446 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.186226 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:42 crc kubenswrapper[4717]: E0217 14:54:42.186559 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.686518155 +0000 UTC m=+149.102358631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.186672 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:42 crc kubenswrapper[4717]: E0217 14:54:42.187296 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.687270526 +0000 UTC m=+149.103111012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.290760 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:42 crc kubenswrapper[4717]: E0217 14:54:42.292273 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.792230347 +0000 UTC m=+149.208070823 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.293475 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:42 crc kubenswrapper[4717]: E0217 14:54:42.293901 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.793892824 +0000 UTC m=+149.209733300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.330479 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg" event={"ID":"1fc864b6-ff33-49d1-bed4-c1e32a47c747","Type":"ContainerStarted","Data":"c7fd8807c9488fa8380db0ecd0fb5e6f6adf18fbd6c5a77966c09dc612fa9957"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.334814 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" event={"ID":"057d447b-4684-4f9c-b3e3-357677289cb5","Type":"ContainerStarted","Data":"3f8617ddd69a55ab061c18811d81022417614933bf48d9817be0fac81f4719a4"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.338476 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz" event={"ID":"a0354b01-9ce9-45bf-bddf-d74b45f815cf","Type":"ContainerStarted","Data":"70b1ca74d5e384bf5532d39a087433e34e05c28de0b683932b4f6d6ceee30646"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.338511 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz" event={"ID":"a0354b01-9ce9-45bf-bddf-d74b45f815cf","Type":"ContainerStarted","Data":"98b85f7c247b2f2222f0ecb3705efd393c382df3a02a446336241f30d20e052c"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.340620 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n" event={"ID":"0597617d-4007-42f7-97b2-660584569977","Type":"ContainerStarted","Data":"dacb7fc265fe9837d4ebf88a052b3aa62069b7338a2a058e3397f651ee9df1a4"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.340685 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n" event={"ID":"0597617d-4007-42f7-97b2-660584569977","Type":"ContainerStarted","Data":"372f1d0041e79d253be5086f8faa3188c429606ae154d6515720f905ab75c2b4"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.375717 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-djfvl" event={"ID":"2f7517f5-ede0-45bc-8407-d24d5e173301","Type":"ContainerStarted","Data":"c3d219bdfa7d8401e3047fb5b58022b0ebf1ae40916c19aebca65d8cfd0470b2"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.385023 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" event={"ID":"f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8","Type":"ContainerStarted","Data":"ac088931a1955799f081334a5d795a8ce8dc5c8cc4589ef9d1a8e901f67159dd"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.387898 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh" event={"ID":"f576665e-17f5-4704-bd20-5debf9fb8612","Type":"ContainerStarted","Data":"8dd30b8266c4c7fbfdf77f1c404708a8b6e9f8d12293e2e420fefa540a94276f"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.396682 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nqkpb" event={"ID":"97dfcf45-5b8b-4c0a-9738-6568da5ec035","Type":"ContainerStarted","Data":"bcd8d5577a45e5e0252779858289699c7980230cbdd233dc8041aecbdae00608"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 
14:54:42.396875 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:42 crc kubenswrapper[4717]: E0217 14:54:42.397258 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:42.897230618 +0000 UTC m=+149.313071244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.406031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" event={"ID":"efd2934c-3d71-447e-bf59-2fd6a1dd6966","Type":"ContainerStarted","Data":"a47edbe477df70a0abcbc06cffd4da1d82dc24610b337052e02b7a113cbb419b"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.406097 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" event={"ID":"efd2934c-3d71-447e-bf59-2fd6a1dd6966","Type":"ContainerStarted","Data":"90c5182edf255ee6667be121fe5d086461993770a2eafb59c5e08e0bac5a1f4e"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.407596 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-whb99" event={"ID":"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed","Type":"ContainerStarted","Data":"6fd7a49d4de85418630824d65f6b5b9a4d89c29aff9c62fdf4414ba4cbd6462a"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.413733 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l" event={"ID":"61c25e66-efcf-43a0-bcc4-922e00481359","Type":"ContainerStarted","Data":"b30dbb0f9d2f72a41b701d682aa0ee178933e930eb4d1b811669eafc4f11748e"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.416288 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" event={"ID":"3de82a84-a13d-4bf9-a242-2365193a9d62","Type":"ContainerStarted","Data":"1a7e3ac4af5f5cc7fecacefb533761043e00ad72fcc6090fff198d3a44acbb46"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.416353 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" event={"ID":"3de82a84-a13d-4bf9-a242-2365193a9d62","Type":"ContainerStarted","Data":"bab53059d4baae61d17798295496cfb80892a86d067aa001081431075df3fb7b"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.434636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xfb64" event={"ID":"44d6dc49-9b2a-408c-a83d-1d5baf26bba4","Type":"ContainerStarted","Data":"a7777628516cec541124f0449b08c96812e9dca7213dd2105b5e7bf6b73e1bfd"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.442913 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t" event={"ID":"cf7e2512-6897-4c4d-b28f-c79deecee58b","Type":"ContainerStarted","Data":"fd9324c694e9b2575e221f3518607bb58d6d10a84e4ac074ea9d62c59bc229d9"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.455401 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" event={"ID":"dc1eca03-6063-41fd-bcab-799db27b8f23","Type":"ContainerStarted","Data":"61c9c381b8144c2b49e4108050e79d615e064f1137a317295420772478a4133e"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.460639 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-r2gmn" event={"ID":"5aed9e9a-0eff-476d-bf5c-e268f4e16e06","Type":"ContainerStarted","Data":"84e98117b73c40e4bbde1e1b20c34cab5c6a419b74423b33ba0ee1d16406a66b"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.461541 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-r2gmn" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.467141 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-r2gmn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.467243 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r2gmn" podUID="5aed9e9a-0eff-476d-bf5c-e268f4e16e06" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.475494 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" event={"ID":"a92032d2-9ec0-4dfd-8545-aa72231c1ca9","Type":"ContainerStarted","Data":"507785876b08478aa0ac90d740b1035ebc839e822c0ca6fb63704da234cf36a4"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.475951 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6ld9l" podStartSLOduration=123.475922975 podStartE2EDuration="2m3.475922975s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:42.439899456 +0000 UTC m=+148.855739952" watchObservedRunningTime="2026-02-17 14:54:42.475922975 +0000 UTC m=+148.891763451" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.477899 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xfb64" podStartSLOduration=124.477884701 podStartE2EDuration="2m4.477884701s" podCreationTimestamp="2026-02-17 14:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:42.475482273 +0000 UTC m=+148.891322769" watchObservedRunningTime="2026-02-17 14:54:42.477884701 +0000 UTC m=+148.893725177" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.488843 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kdfsw" event={"ID":"1416b738-4058-4e5b-be77-62d0b9172a13","Type":"ContainerStarted","Data":"1523caa35ab7a627d6245eff13ba9a1aacb59a1a95481749440723ba8015e618"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.488919 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kdfsw" event={"ID":"1416b738-4058-4e5b-be77-62d0b9172a13","Type":"ContainerStarted","Data":"d4dcfcf7d207d589952a3593311daeba3f3d701aadb191710dfd42858129558c"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.490180 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.492929 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" event={"ID":"2fbfa975-4e24-4213-8368-ba1af6b39e21","Type":"ContainerStarted","Data":"badce36dbfe5f4026d72dc734e534b9ea913545200d7dc87f4d0a7474332f1c2"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.495492 4717 patch_prober.go:28] interesting pod/console-operator-58897d9998-kdfsw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.495555 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kdfsw" podUID="1416b738-4058-4e5b-be77-62d0b9172a13" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.495907 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" event={"ID":"a18b1991-622b-438d-b168-91e5a21ad0f0","Type":"ContainerStarted","Data":"9027ca665f9afc817598ab189d03f50cd2b10cb66d42a8011daaef8cc9c9384a"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.496012 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 14:49:41 +0000 UTC, rotation deadline is 2026-11-04 14:24:09.621405946 +0000 UTC Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.496029 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6239h29m27.125379042s for next certificate rotation Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.497679 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" 
event={"ID":"53a28e17-d55d-4208-a23f-1254796e789f","Type":"ContainerStarted","Data":"8cee442b619021c46cc0476dbb526b7bbd39c5f706fc4bed9da8bb8087403e8f"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.498630 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:42 crc kubenswrapper[4717]: E0217 14:54:42.506298 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.006277114 +0000 UTC m=+149.422117580 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.508871 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-77x5j" event={"ID":"9c4035af-26c4-4c7d-85f6-30fa7b167a43","Type":"ContainerStarted","Data":"3ac9bd55bab36ff52f0db723a80d98c623f6f9f122882ff993077e876b6ec9c4"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.516001 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" 
event={"ID":"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec","Type":"ContainerStarted","Data":"306f59161bf80eaf8318f1967de5ecba450341a78a594abc2223d4d0bf36052b"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.523156 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" event={"ID":"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a","Type":"ContainerStarted","Data":"b79a7613c3f0f14618c91ed84b16302e920a0301613f310c13c15ea057f45dcc"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.531636 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-kdfsw" podStartSLOduration=123.531607531 podStartE2EDuration="2m3.531607531s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:42.522960707 +0000 UTC m=+148.938801193" watchObservedRunningTime="2026-02-17 14:54:42.531607531 +0000 UTC m=+148.947448007" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.532882 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-r2gmn" podStartSLOduration=123.532871877 podStartE2EDuration="2m3.532871877s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:42.496382494 +0000 UTC m=+148.912223000" watchObservedRunningTime="2026-02-17 14:54:42.532871877 +0000 UTC m=+148.948712363" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.555253 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" 
event={"ID":"90118620-9ff2-42f8-a7d4-398df433f1e6","Type":"ContainerStarted","Data":"e7091eca2b2f725544c9a1f2e0e3797eeccd78948206998186f65e39faa86e20"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.555310 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" event={"ID":"90118620-9ff2-42f8-a7d4-398df433f1e6","Type":"ContainerStarted","Data":"fa3049a736ec060bea601ee1a8a09da058872f10babc37c9b073919431e38359"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.559791 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fbqb2" event={"ID":"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4","Type":"ContainerStarted","Data":"d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.559820 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fbqb2" event={"ID":"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4","Type":"ContainerStarted","Data":"78debf03a6a3d7126d4f3abfe5cfcc54f173f1b6a23642d0379409c9c1b4b8c3"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.562006 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-b649c" event={"ID":"b10850bc-6877-4ad3-bda3-d9307748b7e2","Type":"ContainerStarted","Data":"b0594333b8ed02e6a96104d6ccb9936fbcb346f0e2e97ada601f35b77652adb3"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.589672 4717 patch_prober.go:28] interesting pod/router-default-5444994796-jp5jl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:54:42 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 17 14:54:42 crc kubenswrapper[4717]: [+]process-running ok Feb 17 14:54:42 crc kubenswrapper[4717]: healthz check failed Feb 17 14:54:42 crc 
kubenswrapper[4717]: I0217 14:54:42.589726 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jp5jl" podUID="96493978-ef25-4f82-ab8f-29c966a22ac6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.591568 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-fbqb2" podStartSLOduration=123.591540947 podStartE2EDuration="2m3.591540947s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:42.589882941 +0000 UTC m=+149.005723427" watchObservedRunningTime="2026-02-17 14:54:42.591540947 +0000 UTC m=+149.007381423" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.600976 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:42 crc kubenswrapper[4717]: E0217 14:54:42.602151 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.102126837 +0000 UTC m=+149.517967323 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.610500 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" event={"ID":"9b64270c-ff7e-4ad1-ad73-98c5e14346e0","Type":"ContainerStarted","Data":"06b84ad88948ee75894e8b91f0d9cd9dc49965c13ce087042edbe10ffe02cd94"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.628134 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh" event={"ID":"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc","Type":"ContainerStarted","Data":"a056a87622271e106d536a267c52b94555a3df1436a48eeecfb321b7cd58d497"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.632116 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" event={"ID":"f7daba40-adba-48a1-a1c1-139a096354b2","Type":"ContainerStarted","Data":"7713edccb2a928c4ad7a6367ec5c98f1676772abbb6f877b3242cfd1310ae19d"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.634621 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4" event={"ID":"8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a","Type":"ContainerStarted","Data":"fde3c0923fad54f184d4549cabdec24fd0437351a3bcab119dbe790d7cd23650"} Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.671418 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.687000 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.705075 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:42 crc kubenswrapper[4717]: E0217 14:54:42.710700 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.210656899 +0000 UTC m=+149.626497595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.725216 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-smmgb" podStartSLOduration=123.72519301 podStartE2EDuration="2m3.72519301s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:42.723704368 +0000 UTC m=+149.139544874" watchObservedRunningTime="2026-02-17 14:54:42.72519301 +0000 UTC m=+149.141033496" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.725576 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-plkvk" podStartSLOduration=123.725568931 podStartE2EDuration="2m3.725568931s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:42.649627561 +0000 UTC m=+149.065468037" watchObservedRunningTime="2026-02-17 14:54:42.725568931 +0000 UTC m=+149.141409407" Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.808841 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:42 crc kubenswrapper[4717]: E0217 14:54:42.809459 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.309428794 +0000 UTC m=+149.725269280 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:42 crc kubenswrapper[4717]: I0217 14:54:42.912522 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:42 crc kubenswrapper[4717]: E0217 14:54:42.925464 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.425375715 +0000 UTC m=+149.841216191 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.015182 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.015286 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.515252909 +0000 UTC m=+149.931093385 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.016557 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.017656 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.517622896 +0000 UTC m=+149.933463522 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.118571 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.119196 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.61915778 +0000 UTC m=+150.034998256 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.214614 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.214690 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.216122 4717 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-6h5hj container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.216661 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" podUID="dc1eca03-6063-41fd-bcab-799db27b8f23" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.221197 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.221690 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.721666841 +0000 UTC m=+150.137507317 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.322865 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.323108 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.82306241 +0000 UTC m=+150.238902886 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.323276 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.323663 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.823655407 +0000 UTC m=+150.239495883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.426357 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.426550 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.926518438 +0000 UTC m=+150.342358914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.427220 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.427686 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:43.927669181 +0000 UTC m=+150.343509657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.528564 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.529004 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:44.028975258 +0000 UTC m=+150.444815724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.571653 4717 patch_prober.go:28] interesting pod/router-default-5444994796-jp5jl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:54:43 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 17 14:54:43 crc kubenswrapper[4717]: [+]process-running ok Feb 17 14:54:43 crc kubenswrapper[4717]: healthz check failed Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.571724 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jp5jl" podUID="96493978-ef25-4f82-ab8f-29c966a22ac6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.630787 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.631279 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 14:54:44.131257942 +0000 UTC m=+150.547098419 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.659627 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh" event={"ID":"f576665e-17f5-4704-bd20-5debf9fb8612","Type":"ContainerStarted","Data":"048f334f83abdc8b5fd8cf110a15396178ba4112eaa92182b6be89f66e036848"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.669267 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t" event={"ID":"cf7e2512-6897-4c4d-b28f-c79deecee58b","Type":"ContainerStarted","Data":"47a919687867202e5f11cec889cb3d3c00dd96b20b280813e1bfa355af60329f"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.669392 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t" event={"ID":"cf7e2512-6897-4c4d-b28f-c79deecee58b","Type":"ContainerStarted","Data":"1e97cef7d16aec52d0b05bcf53315421a51a8702ac6f63c26181b7547bed82d7"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.671300 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" event={"ID":"3de82a84-a13d-4bf9-a242-2365193a9d62","Type":"ContainerStarted","Data":"928b2d8db7f72162266e57c7d658cd2f190e7dd7219d079b2235f745f3e172fe"} Feb 17 14:54:43 crc 
kubenswrapper[4717]: I0217 14:54:43.688921 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" event={"ID":"72da7396-e722-4b51-9dbb-2f32b9f490b4","Type":"ContainerStarted","Data":"f682683b8f062f002c3ada4351f5e8bc01dc0a9b407066ec765369db7fa8c473"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.689968 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.692575 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh" event={"ID":"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc","Type":"ContainerStarted","Data":"c21df0c37f6e40484ded69812a6546c5ecec4a432425b4ee8a455f89ebd46abf"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.692612 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh" event={"ID":"5ab2c5e9-8f67-4615-b1a0-60fa635a87fc","Type":"ContainerStarted","Data":"8ccd7bd197f7dbc5337038d8d00d78b32f60b9542d56b78cc81e09323f95dbbb"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.708893 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-77x5j" event={"ID":"9c4035af-26c4-4c7d-85f6-30fa7b167a43","Type":"ContainerStarted","Data":"5ed5ec2cfcf893372074a4861962395ff8a2424b00b2275b7088451437c619fb"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.711519 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-b649c" event={"ID":"b10850bc-6877-4ad3-bda3-d9307748b7e2","Type":"ContainerStarted","Data":"a4538d088e1f558c4eb679a0db4804842efdc9b58382a71732b567b9e69fd169"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.728333 4717 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vxkqh" podStartSLOduration=124.728304399 podStartE2EDuration="2m4.728304399s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:43.727510057 +0000 UTC m=+150.143350533" watchObservedRunningTime="2026-02-17 14:54:43.728304399 +0000 UTC m=+150.144144875" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.730455 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" event={"ID":"a18b1991-622b-438d-b168-91e5a21ad0f0","Type":"ContainerStarted","Data":"436d814e7fa2f313e24f1b5fed9b43ab938b7f90dbf008d8d6bfc412753044de"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.733106 4717 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zxwxs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.733179 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" podUID="a18b1991-622b-438d-b168-91e5a21ad0f0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.734167 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.734708 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.736127 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:44.23610347 +0000 UTC m=+150.651943946 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.754476 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" event={"ID":"53a28e17-d55d-4208-a23f-1254796e789f","Type":"ContainerStarted","Data":"766293f12589c0f02f82ec27d7301fd1baee7fa0fe3ca8325a9ea99aff576b46"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.755533 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.760940 4717 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-gch26 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Feb 17 14:54:43 crc kubenswrapper[4717]: 
I0217 14:54:43.761000 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" podUID="53a28e17-d55d-4208-a23f-1254796e789f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.788892 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" event={"ID":"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a","Type":"ContainerStarted","Data":"984cdc1bdeead0a7608d6df5661d47f99ce597623911b388a43dea3120c99a44"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.836568 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.838365 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:44.338349053 +0000 UTC m=+150.754189529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.869929 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" event={"ID":"057d447b-4684-4f9c-b3e3-357677289cb5","Type":"ContainerStarted","Data":"8fd28ab48ac42483a54b6fe8974f5a55ebc52905e01ae605f07bfea47186301c"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.870009 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" event={"ID":"057d447b-4684-4f9c-b3e3-357677289cb5","Type":"ContainerStarted","Data":"a4732a2bdece5c801a7de7f2984c7e4506f35f50162ce639d0697c9eecab5ce1"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.870032 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.870044 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-whb99" event={"ID":"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed","Type":"ContainerStarted","Data":"c89086e25a4434bd2f067dd951a10fc3c77a15ce3ab1269ba7ba0b4dabe090ce"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.890818 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" 
event={"ID":"2fbfa975-4e24-4213-8368-ba1af6b39e21","Type":"ContainerStarted","Data":"6dca40a0f9057be15062a06f09179c08d0369bc7da7a166c9c75fa49c718c09a"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.907357 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg" event={"ID":"1fc864b6-ff33-49d1-bed4-c1e32a47c747","Type":"ContainerStarted","Data":"4c6e02bf49cc95e80013d9f5b2ef460628ea12da184cb875ad2af481bab6458b"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.907474 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg" event={"ID":"1fc864b6-ff33-49d1-bed4-c1e32a47c747","Type":"ContainerStarted","Data":"c5c13194a388b2c8998c86c12554a2dd5d16546159565df32d5c19541a80cd62"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.911505 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" event={"ID":"a92032d2-9ec0-4dfd-8545-aa72231c1ca9","Type":"ContainerStarted","Data":"00e9543eeb7d4568ce34f4ac085b2c9989fcadee199a48e1135f445f059612ee"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.922958 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6vx9t" podStartSLOduration=124.922933987 podStartE2EDuration="2m4.922933987s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:43.921520387 +0000 UTC m=+150.337360863" watchObservedRunningTime="2026-02-17 14:54:43.922933987 +0000 UTC m=+150.338774463" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.924239 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4" event={"ID":"8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a","Type":"ContainerStarted","Data":"180cc8314a6b4b383511db3c12d7d1036573bb587dee9938aea2e1d85aa38fd0"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.924523 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-67r2l" podStartSLOduration=124.924516312 podStartE2EDuration="2m4.924516312s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:43.836941384 +0000 UTC m=+150.252781860" watchObservedRunningTime="2026-02-17 14:54:43.924516312 +0000 UTC m=+150.340356788" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.942596 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nqkpb" event={"ID":"97dfcf45-5b8b-4c0a-9738-6568da5ec035","Type":"ContainerStarted","Data":"6c3c11db2f16529db98d4a67a86d6f5181842176d5af63c074f4d92c93ef9845"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.946050 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:43 crc kubenswrapper[4717]: E0217 14:54:43.947720 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:44.447689398 +0000 UTC m=+150.863529874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.961788 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-b649c" podStartSLOduration=124.961742155 podStartE2EDuration="2m4.961742155s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:43.95906915 +0000 UTC m=+150.374909626" watchObservedRunningTime="2026-02-17 14:54:43.961742155 +0000 UTC m=+150.377582631" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.968283 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" event={"ID":"f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8","Type":"ContainerStarted","Data":"ab17f0dc4bc5e759fa3396a5faf0407c5b5b1aa54d3ed5b33150fe21c76b2bd7"} Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.969041 4717 patch_prober.go:28] interesting pod/console-operator-58897d9998-kdfsw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.969114 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kdfsw" podUID="1416b738-4058-4e5b-be77-62d0b9172a13" containerName="console-operator" 
probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.971871 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.972883 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-r2gmn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.972919 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r2gmn" podUID="5aed9e9a-0eff-476d-bf5c-e268f4e16e06" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.982340 4717 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7268l container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 17 14:54:43 crc kubenswrapper[4717]: I0217 14:54:43.982438 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" podUID="f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.004045 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" podStartSLOduration=125.004016872 podStartE2EDuration="2m5.004016872s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.003069965 +0000 UTC m=+150.418910471" watchObservedRunningTime="2026-02-17 14:54:44.004016872 +0000 UTC m=+150.419857338" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.030200 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-77x5j" podStartSLOduration=125.030173062 podStartE2EDuration="2m5.030173062s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.028626638 +0000 UTC m=+150.444467114" watchObservedRunningTime="2026-02-17 14:54:44.030173062 +0000 UTC m=+150.446013538" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.050440 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.071041 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:44.571011618 +0000 UTC m=+150.986852094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.080347 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dlhxh" podStartSLOduration=125.080319491 podStartE2EDuration="2m5.080319491s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.078533281 +0000 UTC m=+150.494373757" watchObservedRunningTime="2026-02-17 14:54:44.080319491 +0000 UTC m=+150.496159967" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.127592 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" podStartSLOduration=125.127562538 podStartE2EDuration="2m5.127562538s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.114524649 +0000 UTC m=+150.530365125" watchObservedRunningTime="2026-02-17 14:54:44.127562538 +0000 UTC m=+150.543403014" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.150721 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" podStartSLOduration=125.150687433 podStartE2EDuration="2m5.150687433s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.148560963 +0000 UTC m=+150.564401459" watchObservedRunningTime="2026-02-17 14:54:44.150687433 +0000 UTC m=+150.566527919" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.169685 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.170252 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:44.670229116 +0000 UTC m=+151.086069592 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.230290 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fml67" podStartSLOduration=125.230268075 podStartE2EDuration="2m5.230268075s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.186586349 +0000 UTC m=+150.602426845" watchObservedRunningTime="2026-02-17 14:54:44.230268075 +0000 UTC m=+150.646108551" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.231763 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" podStartSLOduration=126.231751787 podStartE2EDuration="2m6.231751787s" podCreationTimestamp="2026-02-17 14:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.230497511 +0000 UTC m=+150.646337997" watchObservedRunningTime="2026-02-17 14:54:44.231751787 +0000 UTC m=+150.647592263" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.253061 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnnjg" podStartSLOduration=125.25304382 podStartE2EDuration="2m5.25304382s" 
podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.252635048 +0000 UTC m=+150.668475524" watchObservedRunningTime="2026-02-17 14:54:44.25304382 +0000 UTC m=+150.668884316" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.272003 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.272535 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:44.772514411 +0000 UTC m=+151.188354887 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.298702 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" podStartSLOduration=125.298678021 podStartE2EDuration="2m5.298678021s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.298234528 +0000 UTC m=+150.714075014" watchObservedRunningTime="2026-02-17 14:54:44.298678021 +0000 UTC m=+150.714518497" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.349641 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nqkpb" podStartSLOduration=8.349612533 podStartE2EDuration="8.349612533s" podCreationTimestamp="2026-02-17 14:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.346601047 +0000 UTC m=+150.762441523" watchObservedRunningTime="2026-02-17 14:54:44.349612533 +0000 UTC m=+150.765453009" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.373843 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.374333 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:44.874313002 +0000 UTC m=+151.290153478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.412187 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4" podStartSLOduration=125.412158373 podStartE2EDuration="2m5.412158373s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.40994627 +0000 UTC m=+150.825786756" watchObservedRunningTime="2026-02-17 14:54:44.412158373 +0000 UTC m=+150.827998849" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.414069 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" podStartSLOduration=125.414061607 podStartE2EDuration="2m5.414061607s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.383511252 +0000 UTC m=+150.799351738" 
watchObservedRunningTime="2026-02-17 14:54:44.414061607 +0000 UTC m=+150.829902083" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.474659 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" podStartSLOduration=125.474636181 podStartE2EDuration="2m5.474636181s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.439016803 +0000 UTC m=+150.854857289" watchObservedRunningTime="2026-02-17 14:54:44.474636181 +0000 UTC m=+150.890476657" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.475470 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.475906 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:44.975888126 +0000 UTC m=+151.391728602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.557751 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4kndh" podStartSLOduration=125.557719832 podStartE2EDuration="2m5.557719832s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.475500325 +0000 UTC m=+150.891340801" watchObservedRunningTime="2026-02-17 14:54:44.557719832 +0000 UTC m=+150.973560308" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.559182 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hm8n" podStartSLOduration=125.559176913 podStartE2EDuration="2m5.559176913s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.554390168 +0000 UTC m=+150.970230654" watchObservedRunningTime="2026-02-17 14:54:44.559176913 +0000 UTC m=+150.975017389" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.570191 4717 patch_prober.go:28] interesting pod/router-default-5444994796-jp5jl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:54:44 crc 
kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 17 14:54:44 crc kubenswrapper[4717]: [+]process-running ok Feb 17 14:54:44 crc kubenswrapper[4717]: healthz check failed Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.570286 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jp5jl" podUID="96493978-ef25-4f82-ab8f-29c966a22ac6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.576894 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.577106 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.07706171 +0000 UTC m=+151.492902186 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.577322 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.577707 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.077697568 +0000 UTC m=+151.493538044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.601518 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hd2cz" podStartSLOduration=125.601490661 podStartE2EDuration="2m5.601490661s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.598559718 +0000 UTC m=+151.014400204" watchObservedRunningTime="2026-02-17 14:54:44.601490661 +0000 UTC m=+151.017331127" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.662704 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-65gm9" podStartSLOduration=125.662678203 podStartE2EDuration="2m5.662678203s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:44.661687375 +0000 UTC m=+151.077527851" watchObservedRunningTime="2026-02-17 14:54:44.662678203 +0000 UTC m=+151.078518669" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.678351 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.678579 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.178538612 +0000 UTC m=+151.594379088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.678648 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.679071 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.179057786 +0000 UTC m=+151.594898262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.780040 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.780307 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.28026302 +0000 UTC m=+151.696103486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.780380 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.780857 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.280839317 +0000 UTC m=+151.696679793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.882205 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.882458 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.382419282 +0000 UTC m=+151.798259758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.882624 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.883045 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.383036589 +0000 UTC m=+151.798877065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.983641 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.983973 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.483932834 +0000 UTC m=+151.899773310 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.984361 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:44 crc kubenswrapper[4717]: E0217 14:54:44.984807 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.484796199 +0000 UTC m=+151.900636675 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.989726 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-whb99" event={"ID":"e2e4b3a0-9d71-4811-bb2f-3ffa58f776ed","Type":"ContainerStarted","Data":"2081cd9a7f982c68410e31866c178dfb64f6c1fb6c6520a618b21455a4cd2271"} Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.989916 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-whb99" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.992253 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" event={"ID":"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec","Type":"ContainerStarted","Data":"718835a03db6f6857c6a99eb01b9df7a1896936e0b0a9cdab0d432ca975d7768"} Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.996663 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9rzh4" event={"ID":"8916f3f2-fa5e-4ca2-b8f3-5d86fe3ae53a","Type":"ContainerStarted","Data":"531e756b5a2933877a783dd3cc06c57c8b02887af41d71db28cddfbf8cad6672"} Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.997346 4717 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7268l container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 17 14:54:44 crc kubenswrapper[4717]: 
I0217 14:54:44.997367 4717 patch_prober.go:28] interesting pod/console-operator-58897d9998-kdfsw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.997392 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" podUID="f1c67bd5-6cf9-46d4-b614-3d056ca8d8e8" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.997411 4717 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zxwxs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.997472 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" podUID="a18b1991-622b-438d-b168-91e5a21ad0f0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.997420 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kdfsw" podUID="1416b738-4058-4e5b-be77-62d0b9172a13" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.998190 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.998267 4717 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-gch26 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.998287 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" podUID="53a28e17-d55d-4208-a23f-1254796e789f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.998632 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-r2gmn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 17 14:54:44 crc kubenswrapper[4717]: I0217 14:54:44.998682 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r2gmn" podUID="5aed9e9a-0eff-476d-bf5c-e268f4e16e06" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.000287 4717 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bxg9v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.000314 4717 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" podUID="2fbfa975-4e24-4213-8368-ba1af6b39e21" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.029458 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-whb99" podStartSLOduration=9.029427752 podStartE2EDuration="9.029427752s" podCreationTimestamp="2026-02-17 14:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:45.02725421 +0000 UTC m=+151.443094686" watchObservedRunningTime="2026-02-17 14:54:45.029427752 +0000 UTC m=+151.445268228" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.085661 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:45 crc kubenswrapper[4717]: E0217 14:54:45.085971 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.585926471 +0000 UTC m=+152.001766947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.086979 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:45 crc kubenswrapper[4717]: E0217 14:54:45.094474 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.594434062 +0000 UTC m=+152.010274538 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.151145 4717 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6ds8r container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.151759 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" podUID="72da7396-e722-4b51-9dbb-2f32b9f490b4" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.151873 4717 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6ds8r container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.151944 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" podUID="72da7396-e722-4b51-9dbb-2f32b9f490b4" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: 
connection refused" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.190141 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:45 crc kubenswrapper[4717]: E0217 14:54:45.191185 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.691154329 +0000 UTC m=+152.106994805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.292635 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:45 crc kubenswrapper[4717]: E0217 14:54:45.293280 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 14:54:45.793251467 +0000 UTC m=+152.209091943 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.394447 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:45 crc kubenswrapper[4717]: E0217 14:54:45.394689 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.894654257 +0000 UTC m=+152.310494733 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.394788 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:45 crc kubenswrapper[4717]: E0217 14:54:45.395214 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.895205773 +0000 UTC m=+152.311046249 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.495953 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:45 crc kubenswrapper[4717]: E0217 14:54:45.496343 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:45.996325365 +0000 UTC m=+152.412165841 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.573338 4717 patch_prober.go:28] interesting pod/router-default-5444994796-jp5jl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:54:45 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 17 14:54:45 crc kubenswrapper[4717]: [+]process-running ok Feb 17 14:54:45 crc kubenswrapper[4717]: healthz check failed Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.573756 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jp5jl" podUID="96493978-ef25-4f82-ab8f-29c966a22ac6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.598377 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:45 crc kubenswrapper[4717]: E0217 14:54:45.598861 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 14:54:46.098838836 +0000 UTC m=+152.514679302 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.699648 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:45 crc kubenswrapper[4717]: E0217 14:54:45.700105 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:46.200065671 +0000 UTC m=+152.615906157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.801537 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.801608 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.801654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.801679 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.801751 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:45 crc kubenswrapper[4717]: E0217 14:54:45.802119 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:46.302057507 +0000 UTC m=+152.717898173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.803969 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.823579 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.824479 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.828801 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.905910 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:45 crc kubenswrapper[4717]: E0217 14:54:45.906321 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:46.406304477 +0000 UTC m=+152.822144953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.971539 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.974364 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:54:45 crc kubenswrapper[4717]: I0217 14:54:45.981537 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.007820 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:46 crc kubenswrapper[4717]: E0217 14:54:46.008315 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:46.508295934 +0000 UTC m=+152.924136410 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.043736 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" event={"ID":"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec","Type":"ContainerStarted","Data":"322f0b3ba7b10be0875e9f88d9fe093d2433ad9f46676f7cecc765efdead9a93"} Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.044984 4717 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bxg9v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: 
connection refused" start-of-body= Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.045138 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" podUID="2fbfa975-4e24-4213-8368-ba1af6b39e21" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.068028 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gch26" Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.092764 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7268l" Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.109422 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:46 crc kubenswrapper[4717]: E0217 14:54:46.110856 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:46.610837616 +0000 UTC m=+153.026678092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.221954 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:46 crc kubenswrapper[4717]: E0217 14:54:46.222987 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:46.722967989 +0000 UTC m=+153.138808465 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.328845 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:46 crc kubenswrapper[4717]: E0217 14:54:46.329406 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:46.829357351 +0000 UTC m=+153.245197827 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.439165 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:46 crc kubenswrapper[4717]: E0217 14:54:46.439705 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:46.939685363 +0000 UTC m=+153.355525839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.540413 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:46 crc kubenswrapper[4717]: E0217 14:54:46.547443 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:47.047404001 +0000 UTC m=+153.463244477 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.582122 4717 patch_prober.go:28] interesting pod/router-default-5444994796-jp5jl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:54:46 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 17 14:54:46 crc kubenswrapper[4717]: [+]process-running ok Feb 17 14:54:46 crc kubenswrapper[4717]: healthz check failed Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.582182 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jp5jl" podUID="96493978-ef25-4f82-ab8f-29c966a22ac6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.652013 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:46 crc kubenswrapper[4717]: E0217 14:54:46.652445 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 14:54:47.152430254 +0000 UTC m=+153.568270730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.761684 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:46 crc kubenswrapper[4717]: E0217 14:54:46.762180 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:47.262159459 +0000 UTC m=+153.677999935 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.836138 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6ds8r" Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.865664 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:46 crc kubenswrapper[4717]: E0217 14:54:46.866512 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:47.366487672 +0000 UTC m=+153.782328148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:46 crc kubenswrapper[4717]: I0217 14:54:46.968554 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:46 crc kubenswrapper[4717]: E0217 14:54:46.968950 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:47.468927921 +0000 UTC m=+153.884768397 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.047226 4717 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zxwxs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.047360 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" podUID="a18b1991-622b-438d-b168-91e5a21ad0f0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.076104 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:47 crc kubenswrapper[4717]: E0217 14:54:47.076675 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-02-17 14:54:47.576654239 +0000 UTC m=+153.992494715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.144690 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.145614 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.148612 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.150729 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.151055 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 17 14:54:47 crc kubenswrapper[4717]: W0217 14:54:47.163303 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-ad6916f0210e9612b72c6e59470041f0a2f433c11d7c28d645d30363e46e7b41 WatchSource:0}: Error finding container ad6916f0210e9612b72c6e59470041f0a2f433c11d7c28d645d30363e46e7b41: Status 404 returned error can't find the container with id 
ad6916f0210e9612b72c6e59470041f0a2f433c11d7c28d645d30363e46e7b41 Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.176836 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:47 crc kubenswrapper[4717]: E0217 14:54:47.177432 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:47.677407491 +0000 UTC m=+154.093247967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.185053 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" event={"ID":"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec","Type":"ContainerStarted","Data":"69a373d7a66b25803d84a400edab37747a668cde087aebd43165eb2826c3d615"} Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.237223 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1b1c62c47bf77c3797c709dda9e30661eb66896ee153977d3d4509a087c6674c"} Feb 17 14:54:47 crc 
kubenswrapper[4717]: I0217 14:54:47.291883 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/543642a2-835c-42b9-8912-3e95230fafa4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"543642a2-835c-42b9-8912-3e95230fafa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.292408 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/543642a2-835c-42b9-8912-3e95230fafa4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"543642a2-835c-42b9-8912-3e95230fafa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.292457 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:47 crc kubenswrapper[4717]: E0217 14:54:47.292904 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:47.792890729 +0000 UTC m=+154.208731205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.307604 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.310010 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.357379 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.388219 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-49dn7"] Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.389482 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.397699 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.398107 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/543642a2-835c-42b9-8912-3e95230fafa4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"543642a2-835c-42b9-8912-3e95230fafa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.398195 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/543642a2-835c-42b9-8912-3e95230fafa4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"543642a2-835c-42b9-8912-3e95230fafa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:54:47 crc kubenswrapper[4717]: E0217 14:54:47.399305 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:47.89928128 +0000 UTC m=+154.315121756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.400003 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/543642a2-835c-42b9-8912-3e95230fafa4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"543642a2-835c-42b9-8912-3e95230fafa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.404171 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.408069 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-49dn7"] Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.473351 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/543642a2-835c-42b9-8912-3e95230fafa4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"543642a2-835c-42b9-8912-3e95230fafa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.499174 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.499281 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-catalog-content\") pod \"community-operators-49dn7\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.499322 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjqqc\" (UniqueName: \"kubernetes.io/projected/59890efa-3bd7-474e-b962-99c705159847-kube-api-access-pjqqc\") pod \"community-operators-49dn7\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.499381 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-utilities\") pod \"community-operators-49dn7\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:54:47 crc kubenswrapper[4717]: E0217 14:54:47.500900 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:48.000884806 +0000 UTC m=+154.416725282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.559215 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tbvbf"] Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.560639 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.565604 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.567790 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.579290 4717 patch_prober.go:28] interesting pod/router-default-5444994796-jp5jl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:54:47 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 17 14:54:47 crc kubenswrapper[4717]: [+]process-running ok Feb 17 14:54:47 crc kubenswrapper[4717]: healthz check failed Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.579349 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jp5jl" podUID="96493978-ef25-4f82-ab8f-29c966a22ac6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.605119 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.605451 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-catalog-content\") pod \"community-operators-49dn7\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.605482 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjqqc\" (UniqueName: \"kubernetes.io/projected/59890efa-3bd7-474e-b962-99c705159847-kube-api-access-pjqqc\") pod 
\"community-operators-49dn7\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.605523 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-utilities\") pod \"community-operators-49dn7\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.606064 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-utilities\") pod \"community-operators-49dn7\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:54:47 crc kubenswrapper[4717]: E0217 14:54:47.606163 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:48.106140174 +0000 UTC m=+154.521980650 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.606417 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-catalog-content\") pod \"community-operators-49dn7\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.648327 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjqqc\" (UniqueName: \"kubernetes.io/projected/59890efa-3bd7-474e-b962-99c705159847-kube-api-access-pjqqc\") pod \"community-operators-49dn7\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.648437 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tbvbf"] Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.706566 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgp6p\" (UniqueName: \"kubernetes.io/projected/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-kube-api-access-dgp6p\") pod \"certified-operators-tbvbf\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.706626 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-catalog-content\") pod \"certified-operators-tbvbf\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.706664 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-utilities\") pod \"certified-operators-tbvbf\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.706707 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:47 crc kubenswrapper[4717]: E0217 14:54:47.707107 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:48.207072951 +0000 UTC m=+154.622913427 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.728792 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.759863 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r7hv9"] Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.801206 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7hv9"] Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.801368 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.807941 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.808319 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgp6p\" (UniqueName: \"kubernetes.io/projected/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-kube-api-access-dgp6p\") pod \"certified-operators-tbvbf\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.808354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-catalog-content\") pod \"certified-operators-tbvbf\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.808386 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-utilities\") pod \"certified-operators-tbvbf\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.809237 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-utilities\") pod \"certified-operators-tbvbf\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " 
pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:54:47 crc kubenswrapper[4717]: E0217 14:54:47.809328 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:48.309307324 +0000 UTC m=+154.725147790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.809904 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-catalog-content\") pod \"certified-operators-tbvbf\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.909686 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.910682 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-catalog-content\") pod \"community-operators-r7hv9\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.910846 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-utilities\") pod \"community-operators-r7hv9\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.911002 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-946vm\" (UniqueName: \"kubernetes.io/projected/2c492899-c1df-4839-b6db-18634da0dbb6-kube-api-access-946vm\") pod \"community-operators-r7hv9\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:54:47 crc kubenswrapper[4717]: E0217 14:54:47.911311 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:48.41128811 +0000 UTC m=+154.827128586 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.914932 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgp6p\" (UniqueName: \"kubernetes.io/projected/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-kube-api-access-dgp6p\") pod \"certified-operators-tbvbf\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.916412 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:54:47 crc kubenswrapper[4717]: I0217 14:54:47.977148 4717 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.006240 4717 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T14:54:47.977175455Z","Handler":null,"Name":""} Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.013162 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6lqj9"] Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.013847 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:48 crc kubenswrapper[4717]: E0217 14:54:48.014028 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:54:48.514000457 +0000 UTC m=+154.929840933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.014181 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-946vm\" (UniqueName: \"kubernetes.io/projected/2c492899-c1df-4839-b6db-18634da0dbb6-kube-api-access-946vm\") pod \"community-operators-r7hv9\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.014245 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.014283 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-catalog-content\") pod \"community-operators-r7hv9\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.014308 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-utilities\") pod \"community-operators-r7hv9\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.014719 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-utilities\") pod \"community-operators-r7hv9\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.014965 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6lqj9" Feb 17 14:54:48 crc kubenswrapper[4717]: E0217 14:54:48.015300 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:54:48.515288284 +0000 UTC m=+154.931128760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mvgrd" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.015588 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-catalog-content\") pod \"community-operators-r7hv9\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.033896 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6lqj9"] Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.059031 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-946vm\" (UniqueName: \"kubernetes.io/projected/2c492899-c1df-4839-b6db-18634da0dbb6-kube-api-access-946vm\") pod \"community-operators-r7hv9\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.091990 4717 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.092031 4717 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.115896 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.116173 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-catalog-content\") pod \"certified-operators-6lqj9\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") " pod="openshift-marketplace/certified-operators-6lqj9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.116201 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hb54\" (UniqueName: \"kubernetes.io/projected/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-kube-api-access-8hb54\") pod \"certified-operators-6lqj9\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") " pod="openshift-marketplace/certified-operators-6lqj9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.116247 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-utilities\") pod \"certified-operators-6lqj9\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") " pod="openshift-marketplace/certified-operators-6lqj9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.153478 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.198036 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-49dn7"] Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.210403 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.220668 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-catalog-content\") pod \"certified-operators-6lqj9\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") " pod="openshift-marketplace/certified-operators-6lqj9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.220717 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hb54\" (UniqueName: \"kubernetes.io/projected/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-kube-api-access-8hb54\") pod \"certified-operators-6lqj9\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") " pod="openshift-marketplace/certified-operators-6lqj9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.220747 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.220784 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-utilities\") pod 
\"certified-operators-6lqj9\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") " pod="openshift-marketplace/certified-operators-6lqj9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.221551 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-utilities\") pod \"certified-operators-6lqj9\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") " pod="openshift-marketplace/certified-operators-6lqj9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.222166 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-catalog-content\") pod \"certified-operators-6lqj9\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") " pod="openshift-marketplace/certified-operators-6lqj9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.228242 4717 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.228291 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.233717 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.248398 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6h5hj" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.250781 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hb54\" (UniqueName: \"kubernetes.io/projected/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-kube-api-access-8hb54\") pod \"certified-operators-6lqj9\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") " pod="openshift-marketplace/certified-operators-6lqj9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.264231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"62ab1a330d8d39d03b4b53c8804e6b6fb9c5050d0415dfd6c871d4aa879c616e"} Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.287582 4717 generic.go:334] "Generic (PLEG): container finished" podID="d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a" 
containerID="984cdc1bdeead0a7608d6df5661d47f99ce597623911b388a43dea3120c99a44" exitCode=0 Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.287701 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" event={"ID":"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a","Type":"ContainerDied","Data":"984cdc1bdeead0a7608d6df5661d47f99ce597623911b388a43dea3120c99a44"} Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.302388 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mvgrd\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.335737 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"303624966f65179fb1f84ce14e8d8514806c8331fa4c1552ce6df9d9c1997d05"} Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.335783 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ad6916f0210e9612b72c6e59470041f0a2f433c11d7c28d645d30363e46e7b41"} Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.336458 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.337290 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.367333 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" event={"ID":"8cf47e32-ca08-44e4-b0c2-07c6d53b8aec","Type":"ContainerStarted","Data":"a4a65059c39d56eeb7d311be2cc2c0426f81858f17168330bfb23c3d5ebcaad6"} Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.368305 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.375959 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1356741fa2d5a7857f8d39af585b921528836d2a4d0ad628d531f844f6222ba5"} Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.376008 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4973c2cfca545c498beae164a9584328071fa3deb384f03d3e69b0c0c12feb8f"} Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.389409 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6lqj9" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.405523 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xfb64" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.450336 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wxgjx" podStartSLOduration=12.450314025 podStartE2EDuration="12.450314025s" podCreationTimestamp="2026-02-17 14:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:48.449266316 +0000 UTC m=+154.865106802" watchObservedRunningTime="2026-02-17 14:54:48.450314025 +0000 UTC m=+154.866154501" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.584375 4717 patch_prober.go:28] interesting pod/router-default-5444994796-jp5jl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:54:48 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 17 14:54:48 crc kubenswrapper[4717]: [+]process-running ok Feb 17 14:54:48 crc kubenswrapper[4717]: healthz check failed Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.584455 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jp5jl" podUID="96493978-ef25-4f82-ab8f-29c966a22ac6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.671742 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tbvbf"] Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.741384 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-r7hv9"] Feb 17 14:54:48 crc kubenswrapper[4717]: I0217 14:54:48.896404 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mvgrd"] Feb 17 14:54:48 crc kubenswrapper[4717]: W0217 14:54:48.920266 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d4c20ae_0163_4deb_b965_5e3f7193d9e4.slice/crio-e69fdc03dc95f9838adf2eccaab026dffe9025feeaa658a1fb7778f3cb76e1a4 WatchSource:0}: Error finding container e69fdc03dc95f9838adf2eccaab026dffe9025feeaa658a1fb7778f3cb76e1a4: Status 404 returned error can't find the container with id e69fdc03dc95f9838adf2eccaab026dffe9025feeaa658a1fb7778f3cb76e1a4 Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.153730 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-r2gmn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.153740 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-r2gmn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.154324 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r2gmn" podUID="5aed9e9a-0eff-476d-bf5c-e268f4e16e06" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.154423 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-r2gmn" 
podUID="5aed9e9a-0eff-476d-bf5c-e268f4e16e06" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.206042 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6lqj9"] Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.347119 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p2gw7"] Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.348469 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.351769 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.363582 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2gw7"] Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.382674 4717 generic.go:334] "Generic (PLEG): container finished" podID="59890efa-3bd7-474e-b962-99c705159847" containerID="a225dcabf286ffcf4d7bcafa646434ca74efc675a58e106c2aa8c93ad38de9b5" exitCode=0 Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.382983 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49dn7" event={"ID":"59890efa-3bd7-474e-b962-99c705159847","Type":"ContainerDied","Data":"a225dcabf286ffcf4d7bcafa646434ca74efc675a58e106c2aa8c93ad38de9b5"} Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.383039 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49dn7" event={"ID":"59890efa-3bd7-474e-b962-99c705159847","Type":"ContainerStarted","Data":"2e2f941eab79d6f8bcd434e7f44852b7dcd7af7e66ade07ac979be436204b1a4"} Feb 
17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.384429 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.385869 4717 generic.go:334] "Generic (PLEG): container finished" podID="2c492899-c1df-4839-b6db-18634da0dbb6" containerID="e79a631850580ab2bc3bbc6199089f7b91027c4065a628c7ba242cd84c5107ec" exitCode=0 Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.385932 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7hv9" event={"ID":"2c492899-c1df-4839-b6db-18634da0dbb6","Type":"ContainerDied","Data":"e79a631850580ab2bc3bbc6199089f7b91027c4065a628c7ba242cd84c5107ec"} Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.385962 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7hv9" event={"ID":"2c492899-c1df-4839-b6db-18634da0dbb6","Type":"ContainerStarted","Data":"da9c76726953dd889e74e3a98b23d0d5aa70eb2ca05ff5f880fa115b3178cc5e"} Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.390952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" event={"ID":"9d4c20ae-0163-4deb-b965-5e3f7193d9e4","Type":"ContainerStarted","Data":"3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1"} Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.391001 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" event={"ID":"9d4c20ae-0163-4deb-b965-5e3f7193d9e4","Type":"ContainerStarted","Data":"e69fdc03dc95f9838adf2eccaab026dffe9025feeaa658a1fb7778f3cb76e1a4"} Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.391043 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.396560 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"543642a2-835c-42b9-8912-3e95230fafa4","Type":"ContainerStarted","Data":"40d0741e16e41d0705650290b06b4ffbbbc76cde2ea2edae9689d0a6c0c81ac7"} Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.396631 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"543642a2-835c-42b9-8912-3e95230fafa4","Type":"ContainerStarted","Data":"0332c4d90e598b79c8ed22383b03122fe791fca4ad8f468d4dcc8ef124bd1074"} Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.403718 4717 generic.go:334] "Generic (PLEG): container finished" podID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" containerID="2af5529243f5f061f672834edc3b8da1d29a3608aaed98ee635a3f85867be4a5" exitCode=0 Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.404366 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lqj9" event={"ID":"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5","Type":"ContainerDied","Data":"2af5529243f5f061f672834edc3b8da1d29a3608aaed98ee635a3f85867be4a5"} Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.404496 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lqj9" event={"ID":"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5","Type":"ContainerStarted","Data":"6f607c35489a674d2beabfeae5823313c42ea453789405281cea7a68da85baf6"} Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.412621 4717 generic.go:334] "Generic (PLEG): container finished" podID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" containerID="ae3f76cad3af7b1848b1007ae54ee1981e9e1cc87d573d066ac5b03ea2d57a3e" exitCode=0 Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.413171 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbvbf" 
event={"ID":"0bae9eeb-1b53-44fd-9751-7b03463ceaf9","Type":"ContainerDied","Data":"ae3f76cad3af7b1848b1007ae54ee1981e9e1cc87d573d066ac5b03ea2d57a3e"} Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.413237 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbvbf" event={"ID":"0bae9eeb-1b53-44fd-9751-7b03463ceaf9","Type":"ContainerStarted","Data":"391a6d19b35f88460acf3cf60717b553987886ac9aa9e5a3759403bb83cd8173"} Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.440129 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.440234 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.441541 4717 patch_prober.go:28] interesting pod/console-f9d7485db-fbqb2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.441600 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-fbqb2" podUID="d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.453049 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" podStartSLOduration=130.453021752 podStartE2EDuration="2m10.453021752s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 
14:54:49.452300122 +0000 UTC m=+155.868140618" watchObservedRunningTime="2026-02-17 14:54:49.453021752 +0000 UTC m=+155.868862248" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.471220 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-utilities\") pod \"redhat-marketplace-p2gw7\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.471329 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2g2p\" (UniqueName: \"kubernetes.io/projected/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-kube-api-access-q2g2p\") pod \"redhat-marketplace-p2gw7\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.471374 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-catalog-content\") pod \"redhat-marketplace-p2gw7\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.492257 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-kdfsw" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.499844 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.499790305 podStartE2EDuration="3.499790305s" podCreationTimestamp="2026-02-17 14:54:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 14:54:49.496603815 +0000 UTC m=+155.912444311" watchObservedRunningTime="2026-02-17 14:54:49.499790305 +0000 UTC m=+155.915630781" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.568470 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.573979 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-utilities\") pod \"redhat-marketplace-p2gw7\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.574127 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2g2p\" (UniqueName: \"kubernetes.io/projected/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-kube-api-access-q2g2p\") pod \"redhat-marketplace-p2gw7\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.574158 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-catalog-content\") pod \"redhat-marketplace-p2gw7\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.574610 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-catalog-content\") pod \"redhat-marketplace-p2gw7\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.579063 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-utilities\") pod \"redhat-marketplace-p2gw7\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.580074 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.638324 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2g2p\" (UniqueName: \"kubernetes.io/projected/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-kube-api-access-q2g2p\") pod \"redhat-marketplace-p2gw7\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.654590 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.674247 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zxwxs" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.726499 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.771574 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zgc8l"] Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.776697 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.804128 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.805468 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.808316 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.811515 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.817931 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zgc8l"] Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.825547 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.842915 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.874042 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.881066 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6kg8\" (UniqueName: \"kubernetes.io/projected/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-kube-api-access-r6kg8\") pod \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " Feb 17 
14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.881294 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-config-volume\") pod \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.881394 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-secret-volume\") pod \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\" (UID: \"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a\") " Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.881666 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-catalog-content\") pod \"redhat-marketplace-zgc8l\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.881819 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-utilities\") pod \"redhat-marketplace-zgc8l\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.881966 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwb5q\" (UniqueName: \"kubernetes.io/projected/61fce349-1321-4c5c-9399-e1b305f505e9-kube-api-access-xwb5q\") pod \"redhat-marketplace-zgc8l\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.882124 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90b34808-4724-4031-b171-9172d2cd0cd5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"90b34808-4724-4031-b171-9172d2cd0cd5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.882205 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90b34808-4724-4031-b171-9172d2cd0cd5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"90b34808-4724-4031-b171-9172d2cd0cd5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.884145 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-config-volume" (OuterVolumeSpecName: "config-volume") pod "d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a" (UID: "d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.897660 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-kube-api-access-r6kg8" (OuterVolumeSpecName: "kube-api-access-r6kg8") pod "d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a" (UID: "d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a"). InnerVolumeSpecName "kube-api-access-r6kg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.899834 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a" (UID: "d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.983314 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90b34808-4724-4031-b171-9172d2cd0cd5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"90b34808-4724-4031-b171-9172d2cd0cd5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.983787 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90b34808-4724-4031-b171-9172d2cd0cd5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"90b34808-4724-4031-b171-9172d2cd0cd5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.983866 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-catalog-content\") pod \"redhat-marketplace-zgc8l\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.983936 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-utilities\") pod \"redhat-marketplace-zgc8l\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.983963 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwb5q\" (UniqueName: \"kubernetes.io/projected/61fce349-1321-4c5c-9399-e1b305f505e9-kube-api-access-xwb5q\") pod \"redhat-marketplace-zgc8l\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " 
pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.984007 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6kg8\" (UniqueName: \"kubernetes.io/projected/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-kube-api-access-r6kg8\") on node \"crc\" DevicePath \"\"" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.984018 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.984027 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.983555 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90b34808-4724-4031-b171-9172d2cd0cd5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"90b34808-4724-4031-b171-9172d2cd0cd5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.985047 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-catalog-content\") pod \"redhat-marketplace-zgc8l\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:54:49 crc kubenswrapper[4717]: I0217 14:54:49.985728 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-utilities\") pod \"redhat-marketplace-zgc8l\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " 
pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.002122 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwb5q\" (UniqueName: \"kubernetes.io/projected/61fce349-1321-4c5c-9399-e1b305f505e9-kube-api-access-xwb5q\") pod \"redhat-marketplace-zgc8l\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.002899 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90b34808-4724-4031-b171-9172d2cd0cd5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"90b34808-4724-4031-b171-9172d2cd0cd5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.100134 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2gw7"] Feb 17 14:54:50 crc kubenswrapper[4717]: W0217 14:54:50.108187 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf91b518f_7204_4a6b_bcfb_bc763ab14d2b.slice/crio-5bbf2495e9f9f8da4910ea349ad4c74df30e1393f3c217d7eb89f63aa7d9d1aa WatchSource:0}: Error finding container 5bbf2495e9f9f8da4910ea349ad4c74df30e1393f3c217d7eb89f63aa7d9d1aa: Status 404 returned error can't find the container with id 5bbf2495e9f9f8da4910ea349ad4c74df30e1393f3c217d7eb89f63aa7d9d1aa Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.186515 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.249960 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.442169 4717 generic.go:334] "Generic (PLEG): container finished" podID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" containerID="c3cae9a9178cf8ee9684b2d34fc4ba2ad8993b4376b8dbbda96c0627f8ec3c25" exitCode=0 Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.442320 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2gw7" event={"ID":"f91b518f-7204-4a6b-bcfb-bc763ab14d2b","Type":"ContainerDied","Data":"c3cae9a9178cf8ee9684b2d34fc4ba2ad8993b4376b8dbbda96c0627f8ec3c25"} Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.442383 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2gw7" event={"ID":"f91b518f-7204-4a6b-bcfb-bc763ab14d2b","Type":"ContainerStarted","Data":"5bbf2495e9f9f8da4910ea349ad4c74df30e1393f3c217d7eb89f63aa7d9d1aa"} Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.448450 4717 generic.go:334] "Generic (PLEG): container finished" podID="543642a2-835c-42b9-8912-3e95230fafa4" containerID="40d0741e16e41d0705650290b06b4ffbbbc76cde2ea2edae9689d0a6c0c81ac7" exitCode=0 Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.448685 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"543642a2-835c-42b9-8912-3e95230fafa4","Type":"ContainerDied","Data":"40d0741e16e41d0705650290b06b4ffbbbc76cde2ea2edae9689d0a6c0c81ac7"} Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.460843 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" event={"ID":"d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a","Type":"ContainerDied","Data":"b79a7613c3f0f14618c91ed84b16302e920a0301613f310c13c15ea057f45dcc"} Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.460927 4717 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b79a7613c3f0f14618c91ed84b16302e920a0301613f310c13c15ea057f45dcc" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.461064 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.481573 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jp5jl" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.552164 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dsx5v"] Feb 17 14:54:50 crc kubenswrapper[4717]: E0217 14:54:50.554274 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a" containerName="collect-profiles" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.554308 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a" containerName="collect-profiles" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.556328 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a" containerName="collect-profiles" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.557463 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.560486 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.564206 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dsx5v"] Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.690195 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zgc8l"] Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.696234 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-utilities\") pod \"redhat-operators-dsx5v\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.696336 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-catalog-content\") pod \"redhat-operators-dsx5v\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.696367 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54mqj\" (UniqueName: \"kubernetes.io/projected/c61c5e17-781e-42f4-852d-0e0604721b86-kube-api-access-54mqj\") pod \"redhat-operators-dsx5v\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:54:50 crc kubenswrapper[4717]: W0217 14:54:50.704893 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61fce349_1321_4c5c_9399_e1b305f505e9.slice/crio-74b444756019b44d5c8c17605d89c6f0b84c3ba7d2206c4672c56715d7df5751 WatchSource:0}: Error finding container 74b444756019b44d5c8c17605d89c6f0b84c3ba7d2206c4672c56715d7df5751: Status 404 returned error can't find the container with id 74b444756019b44d5c8c17605d89c6f0b84c3ba7d2206c4672c56715d7df5751 Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.801911 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-catalog-content\") pod \"redhat-operators-dsx5v\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.802150 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54mqj\" (UniqueName: \"kubernetes.io/projected/c61c5e17-781e-42f4-852d-0e0604721b86-kube-api-access-54mqj\") pod \"redhat-operators-dsx5v\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.802324 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-utilities\") pod \"redhat-operators-dsx5v\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.803148 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-utilities\") pod \"redhat-operators-dsx5v\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:54:50 crc kubenswrapper[4717]: 
I0217 14:54:50.804465 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-catalog-content\") pod \"redhat-operators-dsx5v\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.808979 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.809068 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.814026 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.865254 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54mqj\" (UniqueName: \"kubernetes.io/projected/c61c5e17-781e-42f4-852d-0e0604721b86-kube-api-access-54mqj\") pod \"redhat-operators-dsx5v\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.883901 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.955321 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qlcvs"] Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.956965 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:54:50 crc kubenswrapper[4717]: I0217 14:54:50.990765 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qlcvs"] Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.009539 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-catalog-content\") pod \"redhat-operators-qlcvs\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.009683 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ldh8\" (UniqueName: \"kubernetes.io/projected/9bed5698-7599-46f7-9fbe-44bd48e5a185-kube-api-access-9ldh8\") pod \"redhat-operators-qlcvs\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.009818 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-utilities\") pod \"redhat-operators-qlcvs\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.111523 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-catalog-content\") pod \"redhat-operators-qlcvs\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.111582 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ldh8\" (UniqueName: \"kubernetes.io/projected/9bed5698-7599-46f7-9fbe-44bd48e5a185-kube-api-access-9ldh8\") pod \"redhat-operators-qlcvs\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.111618 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-utilities\") pod \"redhat-operators-qlcvs\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.112194 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-utilities\") pod \"redhat-operators-qlcvs\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.112436 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-catalog-content\") pod \"redhat-operators-qlcvs\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.142593 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ldh8\" (UniqueName: 
\"kubernetes.io/projected/9bed5698-7599-46f7-9fbe-44bd48e5a185-kube-api-access-9ldh8\") pod \"redhat-operators-qlcvs\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.325460 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.366829 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dsx5v"] Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.904840 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"90b34808-4724-4031-b171-9172d2cd0cd5","Type":"ContainerStarted","Data":"2f3e96c405ac03eb0a13639b210f74ad9fc50c6d669907193cc679e4b0607a05"} Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.924606 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dsx5v" event={"ID":"c61c5e17-781e-42f4-852d-0e0604721b86","Type":"ContainerStarted","Data":"be0f9358af7ff3511338d1db489ab7079287a831b3314b6687cc8bf630813b83"} Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.948594 4717 generic.go:334] "Generic (PLEG): container finished" podID="61fce349-1321-4c5c-9399-e1b305f505e9" containerID="1ae9fdbb731c2953e56022220c346e672a98c50f0462a9d51a6b7bbcec0007b5" exitCode=0 Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.948917 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zgc8l" event={"ID":"61fce349-1321-4c5c-9399-e1b305f505e9","Type":"ContainerDied","Data":"1ae9fdbb731c2953e56022220c346e672a98c50f0462a9d51a6b7bbcec0007b5"} Feb 17 14:54:51 crc kubenswrapper[4717]: I0217 14:54:51.948996 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zgc8l" 
event={"ID":"61fce349-1321-4c5c-9399-e1b305f505e9","Type":"ContainerStarted","Data":"74b444756019b44d5c8c17605d89c6f0b84c3ba7d2206c4672c56715d7df5751"} Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.313331 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qlcvs"] Feb 17 14:54:52 crc kubenswrapper[4717]: W0217 14:54:52.343762 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bed5698_7599_46f7_9fbe_44bd48e5a185.slice/crio-773f1e89167e261ad0d8fc5a9a8a4693a55314c3524cca9e23fa2c3fa699b7b9 WatchSource:0}: Error finding container 773f1e89167e261ad0d8fc5a9a8a4693a55314c3524cca9e23fa2c3fa699b7b9: Status 404 returned error can't find the container with id 773f1e89167e261ad0d8fc5a9a8a4693a55314c3524cca9e23fa2c3fa699b7b9 Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.394639 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.579628 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/543642a2-835c-42b9-8912-3e95230fafa4-kube-api-access\") pod \"543642a2-835c-42b9-8912-3e95230fafa4\" (UID: \"543642a2-835c-42b9-8912-3e95230fafa4\") " Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.579712 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/543642a2-835c-42b9-8912-3e95230fafa4-kubelet-dir\") pod \"543642a2-835c-42b9-8912-3e95230fafa4\" (UID: \"543642a2-835c-42b9-8912-3e95230fafa4\") " Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.580105 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/543642a2-835c-42b9-8912-3e95230fafa4-kubelet-dir" 
(OuterVolumeSpecName: "kubelet-dir") pod "543642a2-835c-42b9-8912-3e95230fafa4" (UID: "543642a2-835c-42b9-8912-3e95230fafa4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.591621 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543642a2-835c-42b9-8912-3e95230fafa4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "543642a2-835c-42b9-8912-3e95230fafa4" (UID: "543642a2-835c-42b9-8912-3e95230fafa4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.689658 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/543642a2-835c-42b9-8912-3e95230fafa4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.689701 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/543642a2-835c-42b9-8912-3e95230fafa4-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.969657 4717 generic.go:334] "Generic (PLEG): container finished" podID="c61c5e17-781e-42f4-852d-0e0604721b86" containerID="d80d0309d05208bda67a874cabca4a6b6dbabba774d30fe7d9d64c2791079467" exitCode=0 Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.969733 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dsx5v" event={"ID":"c61c5e17-781e-42f4-852d-0e0604721b86","Type":"ContainerDied","Data":"d80d0309d05208bda67a874cabca4a6b6dbabba774d30fe7d9d64c2791079467"} Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.984634 4717 generic.go:334] "Generic (PLEG): container finished" podID="9bed5698-7599-46f7-9fbe-44bd48e5a185" 
containerID="98575684feef993a86f0e968216d45dc069aa5b15dcc272f7039c510a6844a4e" exitCode=0 Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.984815 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlcvs" event={"ID":"9bed5698-7599-46f7-9fbe-44bd48e5a185","Type":"ContainerDied","Data":"98575684feef993a86f0e968216d45dc069aa5b15dcc272f7039c510a6844a4e"} Feb 17 14:54:52 crc kubenswrapper[4717]: I0217 14:54:52.984851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlcvs" event={"ID":"9bed5698-7599-46f7-9fbe-44bd48e5a185","Type":"ContainerStarted","Data":"773f1e89167e261ad0d8fc5a9a8a4693a55314c3524cca9e23fa2c3fa699b7b9"} Feb 17 14:54:53 crc kubenswrapper[4717]: I0217 14:54:53.004768 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"543642a2-835c-42b9-8912-3e95230fafa4","Type":"ContainerDied","Data":"0332c4d90e598b79c8ed22383b03122fe791fca4ad8f468d4dcc8ef124bd1074"} Feb 17 14:54:53 crc kubenswrapper[4717]: I0217 14:54:53.004818 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0332c4d90e598b79c8ed22383b03122fe791fca4ad8f468d4dcc8ef124bd1074" Feb 17 14:54:53 crc kubenswrapper[4717]: I0217 14:54:53.004822 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:54:53 crc kubenswrapper[4717]: I0217 14:54:53.007966 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"90b34808-4724-4031-b171-9172d2cd0cd5","Type":"ContainerStarted","Data":"251d6e92550ae9207dd34f932c60be2a3a9c4f0cb3e67cd5b375f6db9fe84588"} Feb 17 14:54:53 crc kubenswrapper[4717]: I0217 14:54:53.050782 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.05075206 podStartE2EDuration="4.05075206s" podCreationTimestamp="2026-02-17 14:54:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:54:53.038066121 +0000 UTC m=+159.453906597" watchObservedRunningTime="2026-02-17 14:54:53.05075206 +0000 UTC m=+159.466592536" Feb 17 14:54:54 crc kubenswrapper[4717]: I0217 14:54:54.031373 4717 generic.go:334] "Generic (PLEG): container finished" podID="90b34808-4724-4031-b171-9172d2cd0cd5" containerID="251d6e92550ae9207dd34f932c60be2a3a9c4f0cb3e67cd5b375f6db9fe84588" exitCode=0 Feb 17 14:54:54 crc kubenswrapper[4717]: I0217 14:54:54.031435 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"90b34808-4724-4031-b171-9172d2cd0cd5","Type":"ContainerDied","Data":"251d6e92550ae9207dd34f932c60be2a3a9c4f0cb3e67cd5b375f6db9fe84588"} Feb 17 14:54:55 crc kubenswrapper[4717]: I0217 14:54:54.997048 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-whb99" Feb 17 14:54:55 crc kubenswrapper[4717]: I0217 14:54:55.891002 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:54:56 crc kubenswrapper[4717]: I0217 14:54:56.049105 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90b34808-4724-4031-b171-9172d2cd0cd5-kubelet-dir\") pod \"90b34808-4724-4031-b171-9172d2cd0cd5\" (UID: \"90b34808-4724-4031-b171-9172d2cd0cd5\") " Feb 17 14:54:56 crc kubenswrapper[4717]: I0217 14:54:56.049206 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90b34808-4724-4031-b171-9172d2cd0cd5-kube-api-access\") pod \"90b34808-4724-4031-b171-9172d2cd0cd5\" (UID: \"90b34808-4724-4031-b171-9172d2cd0cd5\") " Feb 17 14:54:56 crc kubenswrapper[4717]: I0217 14:54:56.049214 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90b34808-4724-4031-b171-9172d2cd0cd5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "90b34808-4724-4031-b171-9172d2cd0cd5" (UID: "90b34808-4724-4031-b171-9172d2cd0cd5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:54:56 crc kubenswrapper[4717]: I0217 14:54:56.049520 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90b34808-4724-4031-b171-9172d2cd0cd5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:54:56 crc kubenswrapper[4717]: I0217 14:54:56.062163 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b34808-4724-4031-b171-9172d2cd0cd5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "90b34808-4724-4031-b171-9172d2cd0cd5" (UID: "90b34808-4724-4031-b171-9172d2cd0cd5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:54:56 crc kubenswrapper[4717]: I0217 14:54:56.072730 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"90b34808-4724-4031-b171-9172d2cd0cd5","Type":"ContainerDied","Data":"2f3e96c405ac03eb0a13639b210f74ad9fc50c6d669907193cc679e4b0607a05"} Feb 17 14:54:56 crc kubenswrapper[4717]: I0217 14:54:56.072774 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f3e96c405ac03eb0a13639b210f74ad9fc50c6d669907193cc679e4b0607a05" Feb 17 14:54:56 crc kubenswrapper[4717]: I0217 14:54:56.072793 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:54:56 crc kubenswrapper[4717]: I0217 14:54:56.150919 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90b34808-4724-4031-b171-9172d2cd0cd5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:54:59 crc kubenswrapper[4717]: I0217 14:54:59.156010 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-r2gmn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 17 14:54:59 crc kubenswrapper[4717]: I0217 14:54:59.156746 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r2gmn" podUID="5aed9e9a-0eff-476d-bf5c-e268f4e16e06" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 17 14:54:59 crc kubenswrapper[4717]: I0217 14:54:59.156165 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-r2gmn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 17 14:54:59 crc kubenswrapper[4717]: I0217 14:54:59.156857 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-r2gmn" podUID="5aed9e9a-0eff-476d-bf5c-e268f4e16e06" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 17 14:54:59 crc kubenswrapper[4717]: I0217 14:54:59.453973 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:54:59 crc kubenswrapper[4717]: I0217 14:54:59.457885 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 14:55:01 crc kubenswrapper[4717]: I0217 14:55:01.883044 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:55:01 crc kubenswrapper[4717]: I0217 14:55:01.893052 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f31d30c1-1e4a-49d3-adef-767a88616f33-metrics-certs\") pod \"network-metrics-daemon-pzb78\" (UID: \"f31d30c1-1e4a-49d3-adef-767a88616f33\") " pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:55:02 crc kubenswrapper[4717]: I0217 14:55:02.190255 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pzb78" Feb 17 14:55:08 crc kubenswrapper[4717]: I0217 14:55:08.379268 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:55:09 crc kubenswrapper[4717]: I0217 14:55:09.159765 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-r2gmn" Feb 17 14:55:19 crc kubenswrapper[4717]: I0217 14:55:19.564939 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rk4xx" Feb 17 14:55:20 crc kubenswrapper[4717]: I0217 14:55:20.808465 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:55:20 crc kubenswrapper[4717]: I0217 14:55:20.808862 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:55:25 crc kubenswrapper[4717]: E0217 14:55:25.271075 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 17 14:55:25 crc kubenswrapper[4717]: E0217 14:55:25.271866 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54mqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dsx5v_openshift-marketplace(c61c5e17-781e-42f4-852d-0e0604721b86): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 14:55:25 crc kubenswrapper[4717]: E0217 14:55:25.273272 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dsx5v" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" Feb 17 14:55:25 crc kubenswrapper[4717]: I0217 14:55:25.989921 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:55:26 crc kubenswrapper[4717]: E0217 14:55:26.874687 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dsx5v" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" Feb 17 14:55:26 crc kubenswrapper[4717]: E0217 14:55:26.982014 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 17 14:55:26 crc kubenswrapper[4717]: E0217 14:55:26.982486 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-946vm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-r7hv9_openshift-marketplace(2c492899-c1df-4839-b6db-18634da0dbb6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 14:55:26 crc kubenswrapper[4717]: E0217 14:55:26.983713 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-r7hv9" podUID="2c492899-c1df-4839-b6db-18634da0dbb6" Feb 17 14:55:27 crc 
kubenswrapper[4717]: E0217 14:55:27.003796 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.004032 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xwb5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-zgc8l_openshift-marketplace(61fce349-1321-4c5c-9399-e1b305f505e9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.005381 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zgc8l" podUID="61fce349-1321-4c5c-9399-e1b305f505e9" Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.044629 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.045438 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9ldh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-qlcvs_openshift-marketplace(9bed5698-7599-46f7-9fbe-44bd48e5a185): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.046632 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qlcvs" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" Feb 17 14:55:27 crc 
kubenswrapper[4717]: E0217 14:55:27.112256 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.112471 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q2g2p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-p2gw7_openshift-marketplace(f91b518f-7204-4a6b-bcfb-bc763ab14d2b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.113839 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-p2gw7" podUID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" Feb 17 14:55:27 crc kubenswrapper[4717]: I0217 14:55:27.312719 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbvbf" event={"ID":"0bae9eeb-1b53-44fd-9751-7b03463ceaf9","Type":"ContainerStarted","Data":"72acb4aa00269a342bc4ea255ca0b24ea4e9ec8baaecc368b8d8478d99400cef"} Feb 17 14:55:27 crc kubenswrapper[4717]: I0217 14:55:27.315990 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49dn7" event={"ID":"59890efa-3bd7-474e-b962-99c705159847","Type":"ContainerStarted","Data":"22df71ad461663447fabe0d2b96aaadc8f48852086feb5a62c82e67ac41ff3e3"} Feb 17 14:55:27 crc kubenswrapper[4717]: I0217 14:55:27.319130 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lqj9" event={"ID":"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5","Type":"ContainerStarted","Data":"09dec7e40ae0601f2d60dfbf544a8e6338b8b197e2905d5d9d8b6d087e864f6c"} Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.321630 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zgc8l" 
podUID="61fce349-1321-4c5c-9399-e1b305f505e9" Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.322288 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-p2gw7" podUID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.322347 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qlcvs" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.325430 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-r7hv9" podUID="2c492899-c1df-4839-b6db-18634da0dbb6" Feb 17 14:55:27 crc kubenswrapper[4717]: I0217 14:55:27.433130 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pzb78"] Feb 17 14:55:27 crc kubenswrapper[4717]: E0217 14:55:27.630861 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59890efa_3bd7_474e_b962_99c705159847.slice/crio-22df71ad461663447fabe0d2b96aaadc8f48852086feb5a62c82e67ac41ff3e3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bae9eeb_1b53_44fd_9751_7b03463ceaf9.slice/crio-conmon-72acb4aa00269a342bc4ea255ca0b24ea4e9ec8baaecc368b8d8478d99400cef.scope\": 
RecentStats: unable to find data in memory cache]" Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.326821 4717 generic.go:334] "Generic (PLEG): container finished" podID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" containerID="09dec7e40ae0601f2d60dfbf544a8e6338b8b197e2905d5d9d8b6d087e864f6c" exitCode=0 Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.326891 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lqj9" event={"ID":"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5","Type":"ContainerDied","Data":"09dec7e40ae0601f2d60dfbf544a8e6338b8b197e2905d5d9d8b6d087e864f6c"} Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.332597 4717 generic.go:334] "Generic (PLEG): container finished" podID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" containerID="72acb4aa00269a342bc4ea255ca0b24ea4e9ec8baaecc368b8d8478d99400cef" exitCode=0 Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.332676 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbvbf" event={"ID":"0bae9eeb-1b53-44fd-9751-7b03463ceaf9","Type":"ContainerDied","Data":"72acb4aa00269a342bc4ea255ca0b24ea4e9ec8baaecc368b8d8478d99400cef"} Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.341397 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pzb78" event={"ID":"f31d30c1-1e4a-49d3-adef-767a88616f33","Type":"ContainerStarted","Data":"30c90ece7f06fe3fb8554fda11158237f4dc0d0e5fa02f0111d4a52b22ffb27d"} Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.341449 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pzb78" event={"ID":"f31d30c1-1e4a-49d3-adef-767a88616f33","Type":"ContainerStarted","Data":"592f9211cabb43ba8472187a042a35eab09b7ffe9bbd638b454c392fd3a451c0"} Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.341461 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-pzb78" event={"ID":"f31d30c1-1e4a-49d3-adef-767a88616f33","Type":"ContainerStarted","Data":"04bb49c7ea4a6641c0c3c6c8e0908a8bf961b7eb155b54c675157df5fd4de925"} Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.347726 4717 generic.go:334] "Generic (PLEG): container finished" podID="59890efa-3bd7-474e-b962-99c705159847" containerID="22df71ad461663447fabe0d2b96aaadc8f48852086feb5a62c82e67ac41ff3e3" exitCode=0 Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.347790 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49dn7" event={"ID":"59890efa-3bd7-474e-b962-99c705159847","Type":"ContainerDied","Data":"22df71ad461663447fabe0d2b96aaadc8f48852086feb5a62c82e67ac41ff3e3"} Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.369103 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pzb78" podStartSLOduration=169.369063245 podStartE2EDuration="2m49.369063245s" podCreationTimestamp="2026-02-17 14:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:55:28.364386663 +0000 UTC m=+194.780227139" watchObservedRunningTime="2026-02-17 14:55:28.369063245 +0000 UTC m=+194.784903721" Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.783608 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 14:55:28 crc kubenswrapper[4717]: E0217 14:55:28.784439 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b34808-4724-4031-b171-9172d2cd0cd5" containerName="pruner" Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.784468 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b34808-4724-4031-b171-9172d2cd0cd5" containerName="pruner" Feb 17 14:55:28 crc kubenswrapper[4717]: E0217 14:55:28.784489 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="543642a2-835c-42b9-8912-3e95230fafa4" containerName="pruner"
Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.784501 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="543642a2-835c-42b9-8912-3e95230fafa4" containerName="pruner"
Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.784654 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="543642a2-835c-42b9-8912-3e95230fafa4" containerName="pruner"
Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.784688 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b34808-4724-4031-b171-9172d2cd0cd5" containerName="pruner"
Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.785212 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.788069 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.790984 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.794895 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.951869 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e24e53e5-5f28-4d7d-94a1-e7d49f539b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:55:28 crc kubenswrapper[4717]: I0217 14:55:28.951980 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e24e53e5-5f28-4d7d-94a1-e7d49f539b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.053895 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e24e53e5-5f28-4d7d-94a1-e7d49f539b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.054115 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e24e53e5-5f28-4d7d-94a1-e7d49f539b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.054195 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e24e53e5-5f28-4d7d-94a1-e7d49f539b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.080169 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e24e53e5-5f28-4d7d-94a1-e7d49f539b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.104524 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.356634 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lqj9" event={"ID":"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5","Type":"ContainerStarted","Data":"e83c65743bb5b3310097b5c9be572645ad59aa90ba35a166232ac447ed88ad1a"}
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.359486 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbvbf" event={"ID":"0bae9eeb-1b53-44fd-9751-7b03463ceaf9","Type":"ContainerStarted","Data":"060b9fdcf6947089e74c80da0d707e01f5e25987f36c129028236dea4049fed4"}
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.367653 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49dn7" event={"ID":"59890efa-3bd7-474e-b962-99c705159847","Type":"ContainerStarted","Data":"6d91d88403cab3e8ff653d0d0d873034df34c990b6022b519a5cec16d6230ce8"}
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.392375 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6lqj9" podStartSLOduration=3.043149493 podStartE2EDuration="42.392355555s" podCreationTimestamp="2026-02-17 14:54:47 +0000 UTC" firstStartedPulling="2026-02-17 14:54:49.407231826 +0000 UTC m=+155.823072302" lastFinishedPulling="2026-02-17 14:55:28.756437888 +0000 UTC m=+195.172278364" observedRunningTime="2026-02-17 14:55:29.377611148 +0000 UTC m=+195.793451654" watchObservedRunningTime="2026-02-17 14:55:29.392355555 +0000 UTC m=+195.808196031"
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.406115 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-49dn7" podStartSLOduration=2.893638941 podStartE2EDuration="42.406066893s" podCreationTimestamp="2026-02-17 14:54:47 +0000 UTC" firstStartedPulling="2026-02-17 14:54:49.384174123 +0000 UTC m=+155.800014589" lastFinishedPulling="2026-02-17 14:55:28.896602065 +0000 UTC m=+195.312442541" observedRunningTime="2026-02-17 14:55:29.403170671 +0000 UTC m=+195.819011157" watchObservedRunningTime="2026-02-17 14:55:29.406066893 +0000 UTC m=+195.821907369"
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.425295 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tbvbf" podStartSLOduration=3.001047852 podStartE2EDuration="42.425278897s" podCreationTimestamp="2026-02-17 14:54:47 +0000 UTC" firstStartedPulling="2026-02-17 14:54:49.415328935 +0000 UTC m=+155.831169411" lastFinishedPulling="2026-02-17 14:55:28.83955998 +0000 UTC m=+195.255400456" observedRunningTime="2026-02-17 14:55:29.423410634 +0000 UTC m=+195.839251120" watchObservedRunningTime="2026-02-17 14:55:29.425278897 +0000 UTC m=+195.841119373"
Feb 17 14:55:29 crc kubenswrapper[4717]: I0217 14:55:29.536919 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 14:55:30 crc kubenswrapper[4717]: I0217 14:55:30.375035 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e24e53e5-5f28-4d7d-94a1-e7d49f539b49","Type":"ContainerStarted","Data":"9c84fcf8e90bf1f2356093b9347c88ab62b48ea8ad4711360819d384b3703905"}
Feb 17 14:55:30 crc kubenswrapper[4717]: I0217 14:55:30.375660 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e24e53e5-5f28-4d7d-94a1-e7d49f539b49","Type":"ContainerStarted","Data":"289d83d3b1228bd0404a02d0c70788bb21c872502f63e340ca58ee8f44f6c8b4"}
Feb 17 14:55:30 crc kubenswrapper[4717]: I0217 14:55:30.393800 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.393755173 podStartE2EDuration="2.393755173s" podCreationTimestamp="2026-02-17 14:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:55:30.391949157 +0000 UTC m=+196.807789633" watchObservedRunningTime="2026-02-17 14:55:30.393755173 +0000 UTC m=+196.809595639"
Feb 17 14:55:31 crc kubenswrapper[4717]: I0217 14:55:31.382267 4717 generic.go:334] "Generic (PLEG): container finished" podID="e24e53e5-5f28-4d7d-94a1-e7d49f539b49" containerID="9c84fcf8e90bf1f2356093b9347c88ab62b48ea8ad4711360819d384b3703905" exitCode=0
Feb 17 14:55:31 crc kubenswrapper[4717]: I0217 14:55:31.382403 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e24e53e5-5f28-4d7d-94a1-e7d49f539b49","Type":"ContainerDied","Data":"9c84fcf8e90bf1f2356093b9347c88ab62b48ea8ad4711360819d384b3703905"}
Feb 17 14:55:32 crc kubenswrapper[4717]: I0217 14:55:32.663288 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:55:32 crc kubenswrapper[4717]: I0217 14:55:32.825610 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kube-api-access\") pod \"e24e53e5-5f28-4d7d-94a1-e7d49f539b49\" (UID: \"e24e53e5-5f28-4d7d-94a1-e7d49f539b49\") "
Feb 17 14:55:32 crc kubenswrapper[4717]: I0217 14:55:32.825775 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kubelet-dir\") pod \"e24e53e5-5f28-4d7d-94a1-e7d49f539b49\" (UID: \"e24e53e5-5f28-4d7d-94a1-e7d49f539b49\") "
Feb 17 14:55:32 crc kubenswrapper[4717]: I0217 14:55:32.825855 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e24e53e5-5f28-4d7d-94a1-e7d49f539b49" (UID: "e24e53e5-5f28-4d7d-94a1-e7d49f539b49"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:55:32 crc kubenswrapper[4717]: I0217 14:55:32.826170 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 14:55:32 crc kubenswrapper[4717]: I0217 14:55:32.831340 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e24e53e5-5f28-4d7d-94a1-e7d49f539b49" (UID: "e24e53e5-5f28-4d7d-94a1-e7d49f539b49"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:55:32 crc kubenswrapper[4717]: I0217 14:55:32.931355 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e24e53e5-5f28-4d7d-94a1-e7d49f539b49-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 14:55:33 crc kubenswrapper[4717]: I0217 14:55:33.398051 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e24e53e5-5f28-4d7d-94a1-e7d49f539b49","Type":"ContainerDied","Data":"289d83d3b1228bd0404a02d0c70788bb21c872502f63e340ca58ee8f44f6c8b4"}
Feb 17 14:55:33 crc kubenswrapper[4717]: I0217 14:55:33.398109 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="289d83d3b1228bd0404a02d0c70788bb21c872502f63e340ca58ee8f44f6c8b4"
Feb 17 14:55:33 crc kubenswrapper[4717]: I0217 14:55:33.398191 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:55:34 crc kubenswrapper[4717]: I0217 14:55:34.975093 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 14:55:34 crc kubenswrapper[4717]: E0217 14:55:34.975426 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24e53e5-5f28-4d7d-94a1-e7d49f539b49" containerName="pruner"
Feb 17 14:55:34 crc kubenswrapper[4717]: I0217 14:55:34.975442 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24e53e5-5f28-4d7d-94a1-e7d49f539b49" containerName="pruner"
Feb 17 14:55:34 crc kubenswrapper[4717]: I0217 14:55:34.975576 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24e53e5-5f28-4d7d-94a1-e7d49f539b49" containerName="pruner"
Feb 17 14:55:34 crc kubenswrapper[4717]: I0217 14:55:34.976070 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:55:34 crc kubenswrapper[4717]: I0217 14:55:34.984056 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 17 14:55:34 crc kubenswrapper[4717]: I0217 14:55:34.984310 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 17 14:55:34 crc kubenswrapper[4717]: I0217 14:55:34.989349 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 14:55:35 crc kubenswrapper[4717]: I0217 14:55:35.060154 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:55:35 crc kubenswrapper[4717]: I0217 14:55:35.060496 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-var-lock\") pod \"installer-9-crc\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:55:35 crc kubenswrapper[4717]: I0217 14:55:35.060640 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3f83fb-6861-461b-a7de-5968ce2089fd-kube-api-access\") pod \"installer-9-crc\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:55:35 crc kubenswrapper[4717]: I0217 14:55:35.162396 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:55:35 crc kubenswrapper[4717]: I0217 14:55:35.162872 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-var-lock\") pod \"installer-9-crc\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:55:35 crc kubenswrapper[4717]: I0217 14:55:35.162575 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:55:35 crc kubenswrapper[4717]: I0217 14:55:35.162947 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-var-lock\") pod \"installer-9-crc\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:55:35 crc kubenswrapper[4717]: I0217 14:55:35.163138 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3f83fb-6861-461b-a7de-5968ce2089fd-kube-api-access\") pod \"installer-9-crc\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:55:35 crc kubenswrapper[4717]: I0217 14:55:35.193724 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3f83fb-6861-461b-a7de-5968ce2089fd-kube-api-access\") pod \"installer-9-crc\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:55:35 crc kubenswrapper[4717]: I0217 14:55:35.307638 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:55:35 crc kubenswrapper[4717]: I0217 14:55:35.491927 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 14:55:35 crc kubenswrapper[4717]: W0217 14:55:35.501837 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5b3f83fb_6861_461b_a7de_5968ce2089fd.slice/crio-1fc76fcf7847b9e3f66cbca0f88df2cfcd95e8b16ce156d7f8478d002dd46987 WatchSource:0}: Error finding container 1fc76fcf7847b9e3f66cbca0f88df2cfcd95e8b16ce156d7f8478d002dd46987: Status 404 returned error can't find the container with id 1fc76fcf7847b9e3f66cbca0f88df2cfcd95e8b16ce156d7f8478d002dd46987
Feb 17 14:55:36 crc kubenswrapper[4717]: I0217 14:55:36.414881 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5b3f83fb-6861-461b-a7de-5968ce2089fd","Type":"ContainerStarted","Data":"4f245c2e9bb51be7827de3b9b8123998d802ae58bb16669ff3a3e2018c94300f"}
Feb 17 14:55:36 crc kubenswrapper[4717]: I0217 14:55:36.415305 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5b3f83fb-6861-461b-a7de-5968ce2089fd","Type":"ContainerStarted","Data":"1fc76fcf7847b9e3f66cbca0f88df2cfcd95e8b16ce156d7f8478d002dd46987"}
Feb 17 14:55:36 crc kubenswrapper[4717]: I0217 14:55:36.432761 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.432736321 podStartE2EDuration="2.432736321s" podCreationTimestamp="2026-02-17 14:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:55:36.429120978 +0000 UTC m=+202.844961474" watchObservedRunningTime="2026-02-17 14:55:36.432736321 +0000 UTC m=+202.848576797"
Feb 17 14:55:37 crc kubenswrapper[4717]: I0217 14:55:37.729860 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-49dn7"
Feb 17 14:55:37 crc kubenswrapper[4717]: I0217 14:55:37.730427 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-49dn7"
Feb 17 14:55:37 crc kubenswrapper[4717]: I0217 14:55:37.870584 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-49dn7"
Feb 17 14:55:37 crc kubenswrapper[4717]: I0217 14:55:37.917234 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tbvbf"
Feb 17 14:55:37 crc kubenswrapper[4717]: I0217 14:55:37.917284 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tbvbf"
Feb 17 14:55:37 crc kubenswrapper[4717]: I0217 14:55:37.958210 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tbvbf"
Feb 17 14:55:38 crc kubenswrapper[4717]: I0217 14:55:38.389936 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6lqj9"
Feb 17 14:55:38 crc kubenswrapper[4717]: I0217 14:55:38.390118 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6lqj9"
Feb 17 14:55:38 crc kubenswrapper[4717]: I0217 14:55:38.438340 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6lqj9"
Feb 17 14:55:38 crc kubenswrapper[4717]: I0217 14:55:38.481549 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tbvbf"
Feb 17 14:55:38 crc kubenswrapper[4717]: I0217 14:55:38.501016 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-49dn7"
Feb 17 14:55:39 crc kubenswrapper[4717]: I0217 14:55:39.497964 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6lqj9"
Feb 17 14:55:41 crc kubenswrapper[4717]: I0217 14:55:41.459379 4717 generic.go:334] "Generic (PLEG): container finished" podID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" containerID="cbd4098d589c6a977506dcdb654896865fb61f6b4602773be1c2f8aa3f37a80e" exitCode=0
Feb 17 14:55:41 crc kubenswrapper[4717]: I0217 14:55:41.459482 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2gw7" event={"ID":"f91b518f-7204-4a6b-bcfb-bc763ab14d2b","Type":"ContainerDied","Data":"cbd4098d589c6a977506dcdb654896865fb61f6b4602773be1c2f8aa3f37a80e"}
Feb 17 14:55:43 crc kubenswrapper[4717]: I0217 14:55:43.482444 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6lqj9"]
Feb 17 14:55:43 crc kubenswrapper[4717]: I0217 14:55:43.483492 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6lqj9" podUID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" containerName="registry-server" containerID="cri-o://e83c65743bb5b3310097b5c9be572645ad59aa90ba35a166232ac447ed88ad1a" gracePeriod=2
Feb 17 14:55:44 crc kubenswrapper[4717]: I0217 14:55:44.484454 4717 generic.go:334] "Generic (PLEG): container finished" podID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" containerID="e83c65743bb5b3310097b5c9be572645ad59aa90ba35a166232ac447ed88ad1a" exitCode=0
Feb 17 14:55:44 crc kubenswrapper[4717]: I0217 14:55:44.484541 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lqj9" event={"ID":"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5","Type":"ContainerDied","Data":"e83c65743bb5b3310097b5c9be572645ad59aa90ba35a166232ac447ed88ad1a"}
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.475976 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6lqj9"
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.492858 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lqj9" event={"ID":"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5","Type":"ContainerDied","Data":"6f607c35489a674d2beabfeae5823313c42ea453789405281cea7a68da85baf6"}
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.492930 4717 scope.go:117] "RemoveContainer" containerID="e83c65743bb5b3310097b5c9be572645ad59aa90ba35a166232ac447ed88ad1a"
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.492964 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6lqj9"
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.634990 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-catalog-content\") pod \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") "
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.635214 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-utilities\") pod \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") "
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.635320 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hb54\" (UniqueName: \"kubernetes.io/projected/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-kube-api-access-8hb54\") pod \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\" (UID: \"c1816dd9-8f4c-4667-bdbf-7c6d04e325b5\") "
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.636859 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-utilities" (OuterVolumeSpecName: "utilities") pod "c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" (UID: "c1816dd9-8f4c-4667-bdbf-7c6d04e325b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.643126 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-kube-api-access-8hb54" (OuterVolumeSpecName: "kube-api-access-8hb54") pod "c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" (UID: "c1816dd9-8f4c-4667-bdbf-7c6d04e325b5"). InnerVolumeSpecName "kube-api-access-8hb54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.708223 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" (UID: "c1816dd9-8f4c-4667-bdbf-7c6d04e325b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.737118 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.737168 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hb54\" (UniqueName: \"kubernetes.io/projected/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-kube-api-access-8hb54\") on node \"crc\" DevicePath \"\""
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.737180 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.824979 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6lqj9"]
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.831121 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6lqj9"]
Feb 17 14:55:45 crc kubenswrapper[4717]: I0217 14:55:45.854714 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" path="/var/lib/kubelet/pods/c1816dd9-8f4c-4667-bdbf-7c6d04e325b5/volumes"
Feb 17 14:55:46 crc kubenswrapper[4717]: I0217 14:55:46.905653 4717 scope.go:117] "RemoveContainer" containerID="09dec7e40ae0601f2d60dfbf544a8e6338b8b197e2905d5d9d8b6d087e864f6c"
Feb 17 14:55:46 crc kubenswrapper[4717]: I0217 14:55:46.977176 4717 scope.go:117] "RemoveContainer" containerID="2af5529243f5f061f672834edc3b8da1d29a3608aaed98ee635a3f85867be4a5"
Feb 17 14:55:47 crc kubenswrapper[4717]: I0217 14:55:47.519592 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7hv9" event={"ID":"2c492899-c1df-4839-b6db-18634da0dbb6","Type":"ContainerStarted","Data":"103ede5088aa0f24285e59439bada8d83670e95baa654c45c8f28994255c5df5"}
Feb 17 14:55:47 crc kubenswrapper[4717]: I0217 14:55:47.522853 4717 generic.go:334] "Generic (PLEG): container finished" podID="61fce349-1321-4c5c-9399-e1b305f505e9" containerID="ad70588b0505bef3caea0719c43968399fce8016d113a726540d0388cf50e5e7" exitCode=0
Feb 17 14:55:47 crc kubenswrapper[4717]: I0217 14:55:47.522967 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zgc8l" event={"ID":"61fce349-1321-4c5c-9399-e1b305f505e9","Type":"ContainerDied","Data":"ad70588b0505bef3caea0719c43968399fce8016d113a726540d0388cf50e5e7"}
Feb 17 14:55:47 crc kubenswrapper[4717]: I0217 14:55:47.532159 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dsx5v" event={"ID":"c61c5e17-781e-42f4-852d-0e0604721b86","Type":"ContainerStarted","Data":"6a0be846ae39e6e31c598632bcab84580d85b50efb092094f84ebe0f80bdd9ac"}
Feb 17 14:55:47 crc kubenswrapper[4717]: I0217 14:55:47.540477 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlcvs" event={"ID":"9bed5698-7599-46f7-9fbe-44bd48e5a185","Type":"ContainerStarted","Data":"ae4e25f1c5fe7a729a6018797cf7398a564e697db072d01af263ad245fdde0f3"}
Feb 17 14:55:47 crc kubenswrapper[4717]: I0217 14:55:47.546384 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2gw7" event={"ID":"f91b518f-7204-4a6b-bcfb-bc763ab14d2b","Type":"ContainerStarted","Data":"c65edb7a4a9e69fd801c9e7a08ed2ee3ccd50bce7071b2c71049f23861ae672b"}
Feb 17 14:55:47 crc kubenswrapper[4717]: I0217 14:55:47.638194 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p2gw7" podStartSLOduration=2.105117632 podStartE2EDuration="58.638164261s" podCreationTimestamp="2026-02-17 14:54:49 +0000 UTC" firstStartedPulling="2026-02-17 14:54:50.44410143 +0000 UTC m=+156.859941906" lastFinishedPulling="2026-02-17 14:55:46.977148059 +0000 UTC m=+213.392988535" observedRunningTime="2026-02-17 14:55:47.637939621 +0000 UTC m=+214.053780107" watchObservedRunningTime="2026-02-17 14:55:47.638164261 +0000 UTC m=+214.054004737"
Feb 17 14:55:48 crc kubenswrapper[4717]: I0217 14:55:48.564927 4717 generic.go:334] "Generic (PLEG): container finished" podID="9bed5698-7599-46f7-9fbe-44bd48e5a185" containerID="ae4e25f1c5fe7a729a6018797cf7398a564e697db072d01af263ad245fdde0f3" exitCode=0
Feb 17 14:55:48 crc kubenswrapper[4717]: I0217 14:55:48.565006 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlcvs" event={"ID":"9bed5698-7599-46f7-9fbe-44bd48e5a185","Type":"ContainerDied","Data":"ae4e25f1c5fe7a729a6018797cf7398a564e697db072d01af263ad245fdde0f3"}
Feb 17 14:55:48 crc kubenswrapper[4717]: I0217 14:55:48.567594 4717 generic.go:334] "Generic (PLEG): container finished" podID="c61c5e17-781e-42f4-852d-0e0604721b86" containerID="6a0be846ae39e6e31c598632bcab84580d85b50efb092094f84ebe0f80bdd9ac" exitCode=0
Feb 17 14:55:48 crc kubenswrapper[4717]: I0217 14:55:48.567652 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dsx5v" event={"ID":"c61c5e17-781e-42f4-852d-0e0604721b86","Type":"ContainerDied","Data":"6a0be846ae39e6e31c598632bcab84580d85b50efb092094f84ebe0f80bdd9ac"}
Feb 17 14:55:48 crc kubenswrapper[4717]: I0217 14:55:48.573337 4717 generic.go:334] "Generic (PLEG): container finished" podID="2c492899-c1df-4839-b6db-18634da0dbb6" containerID="103ede5088aa0f24285e59439bada8d83670e95baa654c45c8f28994255c5df5" exitCode=0
Feb 17 14:55:48 crc kubenswrapper[4717]: I0217 14:55:48.573393 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7hv9" event={"ID":"2c492899-c1df-4839-b6db-18634da0dbb6","Type":"ContainerDied","Data":"103ede5088aa0f24285e59439bada8d83670e95baa654c45c8f28994255c5df5"}
Feb 17 14:55:48 crc kubenswrapper[4717]: I0217 14:55:48.573471 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7hv9" event={"ID":"2c492899-c1df-4839-b6db-18634da0dbb6","Type":"ContainerStarted","Data":"1a2a874b9a0b599faf179a1a17e46ff0be4ffebc5c02cbc077aace5fc664351c"}
Feb 17 14:55:48 crc kubenswrapper[4717]: I0217 14:55:48.576286 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zgc8l" event={"ID":"61fce349-1321-4c5c-9399-e1b305f505e9","Type":"ContainerStarted","Data":"b8edb95cc7b4f5d69fa83e8b8ad283e5c008bfa14cc054de1408f439b119dea9"}
Feb 17 14:55:48 crc kubenswrapper[4717]: I0217 14:55:48.623063 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r7hv9" podStartSLOduration=3.052188093 podStartE2EDuration="1m1.623028606s" podCreationTimestamp="2026-02-17 14:54:47 +0000 UTC" firstStartedPulling="2026-02-17 14:54:49.386979503 +0000 UTC m=+155.802819979" lastFinishedPulling="2026-02-17 14:55:47.957820016 +0000 UTC m=+214.373660492" observedRunningTime="2026-02-17 14:55:48.617654498 +0000 UTC m=+215.033494964" watchObservedRunningTime="2026-02-17 14:55:48.623028606 +0000 UTC m=+215.038869092"
Feb 17 14:55:48 crc kubenswrapper[4717]: I0217 14:55:48.708856 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zgc8l" podStartSLOduration=3.735998221 podStartE2EDuration="59.708822937s" podCreationTimestamp="2026-02-17 14:54:49 +0000 UTC" firstStartedPulling="2026-02-17 14:54:51.961406849 +0000 UTC m=+158.377247325" lastFinishedPulling="2026-02-17 14:55:47.934231565 +0000 UTC m=+214.350072041" observedRunningTime="2026-02-17 14:55:48.705749436 +0000 UTC m=+215.121589912" watchObservedRunningTime="2026-02-17 14:55:48.708822937 +0000 UTC m=+215.124663413"
Feb 17 14:55:49 crc kubenswrapper[4717]: I0217 14:55:49.585483 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dsx5v" event={"ID":"c61c5e17-781e-42f4-852d-0e0604721b86","Type":"ContainerStarted","Data":"a7d0f93f2303fa6c27c9343bf49fab182a1cfc956a3d3c1881a03811652bf863"}
Feb 17 14:55:49 crc kubenswrapper[4717]: I0217 14:55:49.589765 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlcvs" event={"ID":"9bed5698-7599-46f7-9fbe-44bd48e5a185","Type":"ContainerStarted","Data":"f7c04384ea26ba92109a004a7e488aa1d3d27ec710cc40d7e45405c97b3297c0"}
Feb 17 14:55:49 crc kubenswrapper[4717]: I0217 14:55:49.613943 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dsx5v" podStartSLOduration=3.617833379 podStartE2EDuration="59.613914167s" podCreationTimestamp="2026-02-17 14:54:50 +0000 UTC" firstStartedPulling="2026-02-17 14:54:52.979064091 +0000 UTC m=+159.394904567" lastFinishedPulling="2026-02-17 14:55:48.975144879 +0000 UTC m=+215.390985355" observedRunningTime="2026-02-17 14:55:49.610867368 +0000 UTC m=+216.026707864" watchObservedRunningTime="2026-02-17 14:55:49.613914167 +0000 UTC m=+216.029754643"
Feb 17 14:55:49 crc kubenswrapper[4717]: I0217 14:55:49.634935 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qlcvs" podStartSLOduration=3.664339294 podStartE2EDuration="59.634905887s" podCreationTimestamp="2026-02-17 14:54:50 +0000 UTC" firstStartedPulling="2026-02-17 14:54:52.995167717 +0000 UTC m=+159.411008193" lastFinishedPulling="2026-02-17 14:55:48.96573432 +0000 UTC m=+215.381574786" observedRunningTime="2026-02-17 14:55:49.631183119 +0000 UTC m=+216.047023625" watchObservedRunningTime="2026-02-17 14:55:49.634905887 +0000 UTC m=+216.050746363"
Feb 17 14:55:49 crc kubenswrapper[4717]: I0217 14:55:49.727284 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p2gw7"
Feb 17 14:55:49 crc kubenswrapper[4717]: I0217 14:55:49.727385 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p2gw7"
Feb 17 14:55:49 crc kubenswrapper[4717]: I0217 14:55:49.782219 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p2gw7"
Feb 17 14:55:50 crc kubenswrapper[4717]: I0217 14:55:50.187810 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zgc8l"
Feb 17 14:55:50 crc kubenswrapper[4717]: I0217 14:55:50.188305 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zgc8l"
Feb 17 14:55:50 crc kubenswrapper[4717]: I0217 14:55:50.808777 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:55:50 crc kubenswrapper[4717]: I0217 14:55:50.808854 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:55:50 crc kubenswrapper[4717]: I0217 14:55:50.808912 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m"
Feb 17 14:55:50 crc kubenswrapper[4717]: I0217 14:55:50.809688 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 14:55:50 crc kubenswrapper[4717]: I0217 14:55:50.809822 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079" gracePeriod=600
Feb 17 14:55:50 crc kubenswrapper[4717]: I0217 14:55:50.886059 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dsx5v"
Feb 17 14:55:50 crc kubenswrapper[4717]: I0217 14:55:50.886153 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dsx5v"
Feb 17 14:55:51 crc kubenswrapper[4717]: I0217 14:55:51.266158 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-zgc8l" podUID="61fce349-1321-4c5c-9399-e1b305f505e9" containerName="registry-server" probeResult="failure" output=<
Feb 17 14:55:51 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s
Feb 17 14:55:51 crc kubenswrapper[4717]: >
Feb 17 14:55:51 crc kubenswrapper[4717]: I0217 14:55:51.327686 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qlcvs"
Feb 17 14:55:51 crc kubenswrapper[4717]: I0217 14:55:51.327754 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qlcvs"
Feb 17 14:55:51 crc kubenswrapper[4717]: I0217 14:55:51.603530 4717 generic.go:334] "Generic (PLEG): container
finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079" exitCode=0 Feb 17 14:55:51 crc kubenswrapper[4717]: I0217 14:55:51.603583 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079"} Feb 17 14:55:51 crc kubenswrapper[4717]: I0217 14:55:51.603620 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"4beac6debb80927cd51e51b507515d3c5694b27b3026cda68a7eb6f2c64a5fbd"} Feb 17 14:55:51 crc kubenswrapper[4717]: I0217 14:55:51.922263 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dsx5v" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" containerName="registry-server" probeResult="failure" output=< Feb 17 14:55:51 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 17 14:55:51 crc kubenswrapper[4717]: > Feb 17 14:55:52 crc kubenswrapper[4717]: I0217 14:55:52.373640 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qlcvs" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" containerName="registry-server" probeResult="failure" output=< Feb 17 14:55:52 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 17 14:55:52 crc kubenswrapper[4717]: > Feb 17 14:55:58 crc kubenswrapper[4717]: I0217 14:55:58.210790 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:55:58 crc kubenswrapper[4717]: I0217 14:55:58.212458 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:55:58 crc kubenswrapper[4717]: I0217 14:55:58.217397 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6lpz6"] Feb 17 14:55:58 crc kubenswrapper[4717]: I0217 14:55:58.295016 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:55:58 crc kubenswrapper[4717]: I0217 14:55:58.690027 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:55:58 crc kubenswrapper[4717]: I0217 14:55:58.741202 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7hv9"] Feb 17 14:55:59 crc kubenswrapper[4717]: I0217 14:55:59.781246 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:56:00 crc kubenswrapper[4717]: I0217 14:56:00.231689 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:56:00 crc kubenswrapper[4717]: I0217 14:56:00.273122 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:56:00 crc kubenswrapper[4717]: I0217 14:56:00.667129 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r7hv9" podUID="2c492899-c1df-4839-b6db-18634da0dbb6" containerName="registry-server" containerID="cri-o://1a2a874b9a0b599faf179a1a17e46ff0be4ffebc5c02cbc077aace5fc664351c" gracePeriod=2 Feb 17 14:56:00 crc kubenswrapper[4717]: I0217 14:56:00.941027 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zgc8l"] Feb 17 14:56:00 crc kubenswrapper[4717]: I0217 14:56:00.945878 4717 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:56:01 crc kubenswrapper[4717]: I0217 14:56:01.011834 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:56:01 crc kubenswrapper[4717]: I0217 14:56:01.379774 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:56:01 crc kubenswrapper[4717]: I0217 14:56:01.431361 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:56:01 crc kubenswrapper[4717]: I0217 14:56:01.678656 4717 generic.go:334] "Generic (PLEG): container finished" podID="2c492899-c1df-4839-b6db-18634da0dbb6" containerID="1a2a874b9a0b599faf179a1a17e46ff0be4ffebc5c02cbc077aace5fc664351c" exitCode=0 Feb 17 14:56:01 crc kubenswrapper[4717]: I0217 14:56:01.678815 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7hv9" event={"ID":"2c492899-c1df-4839-b6db-18634da0dbb6","Type":"ContainerDied","Data":"1a2a874b9a0b599faf179a1a17e46ff0be4ffebc5c02cbc077aace5fc664351c"} Feb 17 14:56:01 crc kubenswrapper[4717]: I0217 14:56:01.679024 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zgc8l" podUID="61fce349-1321-4c5c-9399-e1b305f505e9" containerName="registry-server" containerID="cri-o://b8edb95cc7b4f5d69fa83e8b8ad283e5c008bfa14cc054de1408f439b119dea9" gracePeriod=2 Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.678508 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.689716 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7hv9" event={"ID":"2c492899-c1df-4839-b6db-18634da0dbb6","Type":"ContainerDied","Data":"da9c76726953dd889e74e3a98b23d0d5aa70eb2ca05ff5f880fa115b3178cc5e"} Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.689783 4717 scope.go:117] "RemoveContainer" containerID="1a2a874b9a0b599faf179a1a17e46ff0be4ffebc5c02cbc077aace5fc664351c" Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.689923 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7hv9" Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.722987 4717 scope.go:117] "RemoveContainer" containerID="103ede5088aa0f24285e59439bada8d83670e95baa654c45c8f28994255c5df5" Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.745919 4717 scope.go:117] "RemoveContainer" containerID="e79a631850580ab2bc3bbc6199089f7b91027c4065a628c7ba242cd84c5107ec" Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.773597 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-utilities\") pod \"2c492899-c1df-4839-b6db-18634da0dbb6\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.773663 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-catalog-content\") pod \"2c492899-c1df-4839-b6db-18634da0dbb6\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.773699 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-946vm\" (UniqueName: \"kubernetes.io/projected/2c492899-c1df-4839-b6db-18634da0dbb6-kube-api-access-946vm\") pod \"2c492899-c1df-4839-b6db-18634da0dbb6\" (UID: \"2c492899-c1df-4839-b6db-18634da0dbb6\") " Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.775468 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-utilities" (OuterVolumeSpecName: "utilities") pod "2c492899-c1df-4839-b6db-18634da0dbb6" (UID: "2c492899-c1df-4839-b6db-18634da0dbb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.781524 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c492899-c1df-4839-b6db-18634da0dbb6-kube-api-access-946vm" (OuterVolumeSpecName: "kube-api-access-946vm") pod "2c492899-c1df-4839-b6db-18634da0dbb6" (UID: "2c492899-c1df-4839-b6db-18634da0dbb6"). InnerVolumeSpecName "kube-api-access-946vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.835990 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c492899-c1df-4839-b6db-18634da0dbb6" (UID: "2c492899-c1df-4839-b6db-18634da0dbb6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.875733 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.875814 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c492899-c1df-4839-b6db-18634da0dbb6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:02 crc kubenswrapper[4717]: I0217 14:56:02.875835 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-946vm\" (UniqueName: \"kubernetes.io/projected/2c492899-c1df-4839-b6db-18634da0dbb6-kube-api-access-946vm\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.028943 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7hv9"] Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.032909 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r7hv9"] Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.349928 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qlcvs"] Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.350410 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qlcvs" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" containerName="registry-server" containerID="cri-o://f7c04384ea26ba92109a004a7e488aa1d3d27ec710cc40d7e45405c97b3297c0" gracePeriod=2 Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.701131 4717 generic.go:334] "Generic (PLEG): container finished" podID="9bed5698-7599-46f7-9fbe-44bd48e5a185" containerID="f7c04384ea26ba92109a004a7e488aa1d3d27ec710cc40d7e45405c97b3297c0" 
exitCode=0 Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.701379 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlcvs" event={"ID":"9bed5698-7599-46f7-9fbe-44bd48e5a185","Type":"ContainerDied","Data":"f7c04384ea26ba92109a004a7e488aa1d3d27ec710cc40d7e45405c97b3297c0"} Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.705657 4717 generic.go:334] "Generic (PLEG): container finished" podID="61fce349-1321-4c5c-9399-e1b305f505e9" containerID="b8edb95cc7b4f5d69fa83e8b8ad283e5c008bfa14cc054de1408f439b119dea9" exitCode=0 Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.705691 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zgc8l" event={"ID":"61fce349-1321-4c5c-9399-e1b305f505e9","Type":"ContainerDied","Data":"b8edb95cc7b4f5d69fa83e8b8ad283e5c008bfa14cc054de1408f439b119dea9"} Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.705713 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zgc8l" event={"ID":"61fce349-1321-4c5c-9399-e1b305f505e9","Type":"ContainerDied","Data":"74b444756019b44d5c8c17605d89c6f0b84c3ba7d2206c4672c56715d7df5751"} Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.705726 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74b444756019b44d5c8c17605d89c6f0b84c3ba7d2206c4672c56715d7df5751" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.735966 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.855455 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c492899-c1df-4839-b6db-18634da0dbb6" path="/var/lib/kubelet/pods/2c492899-c1df-4839-b6db-18634da0dbb6/volumes" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.882268 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.888691 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwb5q\" (UniqueName: \"kubernetes.io/projected/61fce349-1321-4c5c-9399-e1b305f505e9-kube-api-access-xwb5q\") pod \"61fce349-1321-4c5c-9399-e1b305f505e9\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.888832 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-utilities\") pod \"61fce349-1321-4c5c-9399-e1b305f505e9\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.889231 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-catalog-content\") pod \"61fce349-1321-4c5c-9399-e1b305f505e9\" (UID: \"61fce349-1321-4c5c-9399-e1b305f505e9\") " Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.889521 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-utilities\") pod \"9bed5698-7599-46f7-9fbe-44bd48e5a185\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.889597 
4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-catalog-content\") pod \"9bed5698-7599-46f7-9fbe-44bd48e5a185\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.891514 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-utilities" (OuterVolumeSpecName: "utilities") pod "9bed5698-7599-46f7-9fbe-44bd48e5a185" (UID: "9bed5698-7599-46f7-9fbe-44bd48e5a185"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.892584 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61fce349-1321-4c5c-9399-e1b305f505e9-kube-api-access-xwb5q" (OuterVolumeSpecName: "kube-api-access-xwb5q") pod "61fce349-1321-4c5c-9399-e1b305f505e9" (UID: "61fce349-1321-4c5c-9399-e1b305f505e9"). InnerVolumeSpecName "kube-api-access-xwb5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.892597 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-utilities" (OuterVolumeSpecName: "utilities") pod "61fce349-1321-4c5c-9399-e1b305f505e9" (UID: "61fce349-1321-4c5c-9399-e1b305f505e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.921198 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61fce349-1321-4c5c-9399-e1b305f505e9" (UID: "61fce349-1321-4c5c-9399-e1b305f505e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.990567 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ldh8\" (UniqueName: \"kubernetes.io/projected/9bed5698-7599-46f7-9fbe-44bd48e5a185-kube-api-access-9ldh8\") pod \"9bed5698-7599-46f7-9fbe-44bd48e5a185\" (UID: \"9bed5698-7599-46f7-9fbe-44bd48e5a185\") " Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.990771 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.990787 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwb5q\" (UniqueName: \"kubernetes.io/projected/61fce349-1321-4c5c-9399-e1b305f505e9-kube-api-access-xwb5q\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.990801 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.990813 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61fce349-1321-4c5c-9399-e1b305f505e9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:03 crc kubenswrapper[4717]: I0217 14:56:03.994324 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bed5698-7599-46f7-9fbe-44bd48e5a185-kube-api-access-9ldh8" (OuterVolumeSpecName: "kube-api-access-9ldh8") pod "9bed5698-7599-46f7-9fbe-44bd48e5a185" (UID: "9bed5698-7599-46f7-9fbe-44bd48e5a185"). InnerVolumeSpecName "kube-api-access-9ldh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.035701 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bed5698-7599-46f7-9fbe-44bd48e5a185" (UID: "9bed5698-7599-46f7-9fbe-44bd48e5a185"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.091928 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bed5698-7599-46f7-9fbe-44bd48e5a185-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.091959 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ldh8\" (UniqueName: \"kubernetes.io/projected/9bed5698-7599-46f7-9fbe-44bd48e5a185-kube-api-access-9ldh8\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.716370 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qlcvs" event={"ID":"9bed5698-7599-46f7-9fbe-44bd48e5a185","Type":"ContainerDied","Data":"773f1e89167e261ad0d8fc5a9a8a4693a55314c3524cca9e23fa2c3fa699b7b9"} Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.716935 4717 scope.go:117] "RemoveContainer" containerID="f7c04384ea26ba92109a004a7e488aa1d3d27ec710cc40d7e45405c97b3297c0" Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.716391 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qlcvs" Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.716391 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zgc8l" Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.740657 4717 scope.go:117] "RemoveContainer" containerID="ae4e25f1c5fe7a729a6018797cf7398a564e697db072d01af263ad245fdde0f3" Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.751404 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qlcvs"] Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.754744 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qlcvs"] Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.906704 4717 scope.go:117] "RemoveContainer" containerID="98575684feef993a86f0e968216d45dc069aa5b15dcc272f7039c510a6844a4e" Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.926837 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zgc8l"] Feb 17 14:56:04 crc kubenswrapper[4717]: I0217 14:56:04.932906 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zgc8l"] Feb 17 14:56:05 crc kubenswrapper[4717]: I0217 14:56:05.855430 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61fce349-1321-4c5c-9399-e1b305f505e9" path="/var/lib/kubelet/pods/61fce349-1321-4c5c-9399-e1b305f505e9/volumes" Feb 17 14:56:05 crc kubenswrapper[4717]: I0217 14:56:05.856607 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" path="/var/lib/kubelet/pods/9bed5698-7599-46f7-9fbe-44bd48e5a185/volumes" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.445358 4717 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.445948 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" 
containerName="extract-content" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.445967 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" containerName="extract-content" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.445979 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61fce349-1321-4c5c-9399-e1b305f505e9" containerName="extract-utilities" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.445987 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="61fce349-1321-4c5c-9399-e1b305f505e9" containerName="extract-utilities" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.445996 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61fce349-1321-4c5c-9399-e1b305f505e9" containerName="extract-content" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446004 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="61fce349-1321-4c5c-9399-e1b305f505e9" containerName="extract-content" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.446017 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" containerName="extract-utilities" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446025 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" containerName="extract-utilities" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.446033 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446040 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.446053 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" 
containerName="extract-content" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446061 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" containerName="extract-content" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.446070 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61fce349-1321-4c5c-9399-e1b305f505e9" containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446094 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="61fce349-1321-4c5c-9399-e1b305f505e9" containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.446108 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c492899-c1df-4839-b6db-18634da0dbb6" containerName="extract-utilities" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446116 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c492899-c1df-4839-b6db-18634da0dbb6" containerName="extract-utilities" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.446128 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c492899-c1df-4839-b6db-18634da0dbb6" containerName="extract-content" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446135 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c492899-c1df-4839-b6db-18634da0dbb6" containerName="extract-content" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.446144 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c492899-c1df-4839-b6db-18634da0dbb6" containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446153 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c492899-c1df-4839-b6db-18634da0dbb6" containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.446161 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" 
containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446170 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.446178 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" containerName="extract-utilities" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446187 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" containerName="extract-utilities" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446311 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1816dd9-8f4c-4667-bdbf-7c6d04e325b5" containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446328 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bed5698-7599-46f7-9fbe-44bd48e5a185" containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446340 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c492899-c1df-4839-b6db-18634da0dbb6" containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446348 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="61fce349-1321-4c5c-9399-e1b305f505e9" containerName="registry-server" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446766 4717 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446902 4717 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.446972 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447070 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685" gracePeriod=15 Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447260 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8" gracePeriod=15 Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447336 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b" gracePeriod=15 Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447450 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034" gracePeriod=15 Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447442 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034" gracePeriod=15 Feb 17 14:56:13 crc 
kubenswrapper[4717]: E0217 14:56:13.447614 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447635 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.447646 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447653 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.447665 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447679 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.447687 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447692 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.447701 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447707 4717 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.447716 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447723 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.447734 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447740 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.447951 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.448040 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.448050 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.448057 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.448064 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.448072 4717 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.448098 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:56:13 crc kubenswrapper[4717]: E0217 14:56:13.448229 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.448247 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.454904 4717 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.487814 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.488875 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.489177 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.489456 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.489720 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.489991 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.490325 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.490568 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591538 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591587 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591641 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591666 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591697 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591752 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591792 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591774 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591721 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591771 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591877 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591927 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.591979 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.592006 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.592112 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.776911 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.778392 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 14:56:13 crc kubenswrapper[4717]: I0217 14:56:13.779313 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b" exitCode=2 Feb 17 14:56:14 crc kubenswrapper[4717]: I0217 14:56:14.787238 4717 generic.go:334] "Generic (PLEG): container finished" podID="5b3f83fb-6861-461b-a7de-5968ce2089fd" containerID="4f245c2e9bb51be7827de3b9b8123998d802ae58bb16669ff3a3e2018c94300f" exitCode=0 Feb 17 14:56:14 crc kubenswrapper[4717]: I0217 14:56:14.787302 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5b3f83fb-6861-461b-a7de-5968ce2089fd","Type":"ContainerDied","Data":"4f245c2e9bb51be7827de3b9b8123998d802ae58bb16669ff3a3e2018c94300f"} Feb 17 14:56:14 crc kubenswrapper[4717]: I0217 14:56:14.788519 4717 status_manager.go:851] "Failed to get status for pod" 
podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:14 crc kubenswrapper[4717]: I0217 14:56:14.789707 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 14:56:14 crc kubenswrapper[4717]: I0217 14:56:14.791149 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 14:56:14 crc kubenswrapper[4717]: I0217 14:56:14.791742 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8" exitCode=0 Feb 17 14:56:14 crc kubenswrapper[4717]: I0217 14:56:14.791771 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034" exitCode=0 Feb 17 14:56:14 crc kubenswrapper[4717]: I0217 14:56:14.791783 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034" exitCode=0 Feb 17 14:56:14 crc kubenswrapper[4717]: I0217 14:56:14.791833 4717 scope.go:117] "RemoveContainer" containerID="2ed13f58aa42583e826479cabebd118937febb863731faaeafba21828884fc23" Feb 17 14:56:15 crc kubenswrapper[4717]: I0217 14:56:15.800248 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 14:56:15 crc kubenswrapper[4717]: I0217 14:56:15.802047 4717 generic.go:334] 
"Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685" exitCode=0 Feb 17 14:56:15 crc kubenswrapper[4717]: I0217 14:56:15.855271 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.024509 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.025648 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.027791 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3f83fb-6861-461b-a7de-5968ce2089fd-kube-api-access\") pod \"5b3f83fb-6861-461b-a7de-5968ce2089fd\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.027861 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-kubelet-dir\") pod \"5b3f83fb-6861-461b-a7de-5968ce2089fd\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.027933 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-var-lock\") pod \"5b3f83fb-6861-461b-a7de-5968ce2089fd\" (UID: \"5b3f83fb-6861-461b-a7de-5968ce2089fd\") " Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.027964 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5b3f83fb-6861-461b-a7de-5968ce2089fd" (UID: "5b3f83fb-6861-461b-a7de-5968ce2089fd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.028039 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-var-lock" (OuterVolumeSpecName: "var-lock") pod "5b3f83fb-6861-461b-a7de-5968ce2089fd" (UID: "5b3f83fb-6861-461b-a7de-5968ce2089fd"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.028233 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.028266 4717 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5b3f83fb-6861-461b-a7de-5968ce2089fd-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.033742 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3f83fb-6861-461b-a7de-5968ce2089fd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5b3f83fb-6861-461b-a7de-5968ce2089fd" (UID: "5b3f83fb-6861-461b-a7de-5968ce2089fd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.129615 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3f83fb-6861-461b-a7de-5968ce2089fd-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.441431 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.442248 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.442943 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.443189 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.635544 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.635956 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.636000 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.635642 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.636060 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.636184 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.636861 4717 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.636889 4717 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.636899 4717 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.810662 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5b3f83fb-6861-461b-a7de-5968ce2089fd","Type":"ContainerDied","Data":"1fc76fcf7847b9e3f66cbca0f88df2cfcd95e8b16ce156d7f8478d002dd46987"} Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.810711 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fc76fcf7847b9e3f66cbca0f88df2cfcd95e8b16ce156d7f8478d002dd46987" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.810769 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.817801 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.818646 4717 scope.go:117] "RemoveContainer" containerID="cf697ac5255736d0957e65ac56946a786e3250f17055d2884008b0407d466eb8" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.818696 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.831356 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.831670 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.838196 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.838742 4717 status_manager.go:851] "Failed to get status for pod" 
podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.841667 4717 scope.go:117] "RemoveContainer" containerID="b0ad302022d68a0ca4495a90602ad70eef66cc2eb31eb4f5b52a608b1380f034" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.856998 4717 scope.go:117] "RemoveContainer" containerID="878862c7e82eb192006e48b9ec47097ceb9c7b08e3c1dda4951c1e25969b5034" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.872449 4717 scope.go:117] "RemoveContainer" containerID="287e0617df08d220a654471943181ac2179f8735461b98fc1fb9f9c1a5cc119b" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.885867 4717 scope.go:117] "RemoveContainer" containerID="a643779c2b6f77cad82de81545f9035884fd693b893d309bcca7ef3ed8ae4685" Feb 17 14:56:16 crc kubenswrapper[4717]: I0217 14:56:16.900119 4717 scope.go:117] "RemoveContainer" containerID="a3da60692660c0ea735bbdc539867c3d31619ebc95a3c26cf30e1b1effb0373e" Feb 17 14:56:16 crc kubenswrapper[4717]: E0217 14:56:16.941326 4717 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.74:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" volumeName="registry-storage" Feb 17 14:56:17 crc kubenswrapper[4717]: I0217 14:56:17.854494 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 17 14:56:18 crc kubenswrapper[4717]: E0217 14:56:18.491189 
4717 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.74:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:18 crc kubenswrapper[4717]: I0217 14:56:18.492039 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:18 crc kubenswrapper[4717]: W0217 14:56:18.516567 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-f08faa177760823230a6b701e246642a5482c2cc0454a6a5ef6799ffb73aa452 WatchSource:0}: Error finding container f08faa177760823230a6b701e246642a5482c2cc0454a6a5ef6799ffb73aa452: Status 404 returned error can't find the container with id f08faa177760823230a6b701e246642a5482c2cc0454a6a5ef6799ffb73aa452 Feb 17 14:56:18 crc kubenswrapper[4717]: E0217 14:56:18.520545 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.74:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18951081e510bffe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:56:18.520063998 +0000 UTC 
m=+244.935904464,LastTimestamp:2026-02-17 14:56:18.520063998 +0000 UTC m=+244.935904464,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 14:56:18 crc kubenswrapper[4717]: I0217 14:56:18.834984 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bade48f91e1f7170476b32ad9dc36444531f6397bb27f763413016f5c2bb217c"} Feb 17 14:56:18 crc kubenswrapper[4717]: I0217 14:56:18.835402 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f08faa177760823230a6b701e246642a5482c2cc0454a6a5ef6799ffb73aa452"} Feb 17 14:56:18 crc kubenswrapper[4717]: E0217 14:56:18.836054 4717 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.74:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:56:18 crc kubenswrapper[4717]: I0217 14:56:18.836143 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:20 crc kubenswrapper[4717]: E0217 14:56:20.215165 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.74:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18951081e510bffe 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:56:18.520063998 +0000 UTC m=+244.935904464,LastTimestamp:2026-02-17 14:56:18.520063998 +0000 UTC m=+244.935904464,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 14:56:22 crc kubenswrapper[4717]: E0217 14:56:22.819576 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:22 crc kubenswrapper[4717]: E0217 14:56:22.819888 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:22 crc kubenswrapper[4717]: E0217 14:56:22.820200 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:22 crc kubenswrapper[4717]: E0217 14:56:22.820648 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: 
connection refused" Feb 17 14:56:22 crc kubenswrapper[4717]: E0217 14:56:22.820996 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:22 crc kubenswrapper[4717]: I0217 14:56:22.821058 4717 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 17 14:56:22 crc kubenswrapper[4717]: E0217 14:56:22.821427 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="200ms" Feb 17 14:56:23 crc kubenswrapper[4717]: E0217 14:56:23.022466 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="400ms" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.256706 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" podUID="eee77ab9-8247-4813-8137-d627a31c8840" containerName="oauth-openshift" containerID="cri-o://e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0" gracePeriod=15 Feb 17 14:56:23 crc kubenswrapper[4717]: E0217 14:56:23.424237 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="800ms" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.677790 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.679533 4717 status_manager.go:851] "Failed to get status for pod" podUID="eee77ab9-8247-4813-8137-d627a31c8840" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6lpz6\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.680499 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.787578 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-idp-0-file-data\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788165 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-service-ca\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788197 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-router-certs\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" 
(UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788224 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-session\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788252 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-login\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788289 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-provider-selection\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788326 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-cliconfig\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788359 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-audit-policies\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: 
I0217 14:56:23.788400 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-trusted-ca-bundle\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788443 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-serving-cert\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788478 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-ocp-branding-template\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788526 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-error\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788651 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d6cx\" (UniqueName: \"kubernetes.io/projected/eee77ab9-8247-4813-8137-d627a31c8840-kube-api-access-6d6cx\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.788676 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eee77ab9-8247-4813-8137-d627a31c8840-audit-dir\") pod \"eee77ab9-8247-4813-8137-d627a31c8840\" (UID: \"eee77ab9-8247-4813-8137-d627a31c8840\") " Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.789244 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.789283 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.789744 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.789937 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eee77ab9-8247-4813-8137-d627a31c8840-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.790044 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.790330 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.790354 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.790367 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.790381 4717 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.790397 4717 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eee77ab9-8247-4813-8137-d627a31c8840-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.796415 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.836073 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee77ab9-8247-4813-8137-d627a31c8840-kube-api-access-6d6cx" (OuterVolumeSpecName: "kube-api-access-6d6cx") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "kube-api-access-6d6cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.837339 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.838679 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.839472 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.839905 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.840325 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.840541 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.841475 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "eee77ab9-8247-4813-8137-d627a31c8840" (UID: "eee77ab9-8247-4813-8137-d627a31c8840"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.876587 4717 generic.go:334] "Generic (PLEG): container finished" podID="eee77ab9-8247-4813-8137-d627a31c8840" containerID="e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0" exitCode=0 Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.876637 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" event={"ID":"eee77ab9-8247-4813-8137-d627a31c8840","Type":"ContainerDied","Data":"e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0"} Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.876673 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" event={"ID":"eee77ab9-8247-4813-8137-d627a31c8840","Type":"ContainerDied","Data":"f6603c8f36fde83763130015cdc80da5c9f642485428f0e328f4ddc9728e66d3"} Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.876700 4717 scope.go:117] "RemoveContainer" containerID="e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.876843 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.877527 4717 status_manager.go:851] "Failed to get status for pod" podUID="eee77ab9-8247-4813-8137-d627a31c8840" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6lpz6\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.878227 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.891059 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.891173 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.891190 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.891203 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d6cx\" (UniqueName: 
\"kubernetes.io/projected/eee77ab9-8247-4813-8137-d627a31c8840-kube-api-access-6d6cx\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.891215 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.891226 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.891236 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.891244 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.891253 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eee77ab9-8247-4813-8137-d627a31c8840-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.920410 4717 status_manager.go:851] "Failed to get status for pod" podUID="eee77ab9-8247-4813-8137-d627a31c8840" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6lpz6\": dial 
tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.921260 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.928456 4717 scope.go:117] "RemoveContainer" containerID="e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0" Feb 17 14:56:23 crc kubenswrapper[4717]: E0217 14:56:23.929051 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0\": container with ID starting with e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0 not found: ID does not exist" containerID="e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0" Feb 17 14:56:23 crc kubenswrapper[4717]: I0217 14:56:23.929115 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0"} err="failed to get container status \"e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0\": rpc error: code = NotFound desc = could not find container \"e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0\": container with ID starting with e8ed4d8093b9390896f37c667b6f23166f1b58c6f29fd5dd1fef4a1360e148f0 not found: ID does not exist" Feb 17 14:56:24 crc kubenswrapper[4717]: E0217 14:56:24.225492 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" 
interval="1.6s" Feb 17 14:56:25 crc kubenswrapper[4717]: E0217 14:56:25.827201 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="3.2s" Feb 17 14:56:25 crc kubenswrapper[4717]: I0217 14:56:25.848701 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:25 crc kubenswrapper[4717]: I0217 14:56:25.849173 4717 status_manager.go:851] "Failed to get status for pod" podUID="eee77ab9-8247-4813-8137-d627a31c8840" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6lpz6\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:26 crc kubenswrapper[4717]: I0217 14:56:26.846206 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:26 crc kubenswrapper[4717]: I0217 14:56:26.848187 4717 status_manager.go:851] "Failed to get status for pod" podUID="eee77ab9-8247-4813-8137-d627a31c8840" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6lpz6\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:26 crc kubenswrapper[4717]: I0217 14:56:26.849246 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:26 crc kubenswrapper[4717]: I0217 14:56:26.865532 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f5c0095d-2b66-4009-a169-addd2dae6c89" Feb 17 14:56:26 crc kubenswrapper[4717]: I0217 14:56:26.865601 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f5c0095d-2b66-4009-a169-addd2dae6c89" Feb 17 14:56:26 crc kubenswrapper[4717]: E0217 14:56:26.866472 4717 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:26 crc kubenswrapper[4717]: I0217 14:56:26.867101 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.912524 4717 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="e882ef2162c2ec660710b2dc21ac413751d9ac5d3e531bbe9877d98fc1952263" exitCode=0 Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.912634 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"e882ef2162c2ec660710b2dc21ac413751d9ac5d3e531bbe9877d98fc1952263"} Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.913509 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bddc8c0ebbf5c53788123e419d0b39dc665823230775408909fe407dc6e7e006"} Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.913857 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f5c0095d-2b66-4009-a169-addd2dae6c89" Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.913874 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f5c0095d-2b66-4009-a169-addd2dae6c89" Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.914556 4717 status_manager.go:851] "Failed to get status for pod" podUID="eee77ab9-8247-4813-8137-d627a31c8840" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6lpz6\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:27 crc kubenswrapper[4717]: E0217 14:56:27.914599 4717 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.915602 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.921038 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.921132 4717 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236" exitCode=1 Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.921180 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236"} Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.921790 4717 scope.go:117] "RemoveContainer" containerID="f228c1b213641fafe62a2ca946717c23921e4ae34447cc109af24300f48b3236" Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.928590 4717 status_manager.go:851] "Failed to get status for pod" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 
14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.929495 4717 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:27 crc kubenswrapper[4717]: I0217 14:56:27.930033 4717 status_manager.go:851] "Failed to get status for pod" podUID="eee77ab9-8247-4813-8137-d627a31c8840" pod="openshift-authentication/oauth-openshift-558db77b4-6lpz6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6lpz6\": dial tcp 38.102.83.74:6443: connect: connection refused" Feb 17 14:56:28 crc kubenswrapper[4717]: I0217 14:56:28.201750 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:56:28 crc kubenswrapper[4717]: I0217 14:56:28.928236 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"52f62029f7b14b1ac175228523f05f70fc9518c9c6c5183ac02c7b611d658cd1"} Feb 17 14:56:28 crc kubenswrapper[4717]: I0217 14:56:28.928880 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"88c4f259c403805a005a90b93bbc3f3d8caa8fff0a25343c7c13eb6ae4540fa6"} Feb 17 14:56:28 crc kubenswrapper[4717]: I0217 14:56:28.931403 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 14:56:28 crc kubenswrapper[4717]: I0217 14:56:28.931449 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"20660daf3dea3a42f4129ac8def41003a2859f9c68776f801fac251c83ca065b"} Feb 17 14:56:29 crc kubenswrapper[4717]: I0217 14:56:29.941636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3fd00ed6c7ce1b8d96b63c553c2765eefad5c6557c7234a055fb6fad7113b486"} Feb 17 14:56:29 crc kubenswrapper[4717]: I0217 14:56:29.942144 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2bf8643078674feeb6f7efc670a39ed06cf527c1ac4a6fb56a23acbec2b8b917"} Feb 17 14:56:29 crc kubenswrapper[4717]: I0217 14:56:29.942168 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"69cf783b88679002efe181892985ada172dc63191916744b26a8327b67740529"} Feb 17 14:56:29 crc kubenswrapper[4717]: I0217 14:56:29.942018 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f5c0095d-2b66-4009-a169-addd2dae6c89" Feb 17 14:56:29 crc kubenswrapper[4717]: I0217 14:56:29.942195 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f5c0095d-2b66-4009-a169-addd2dae6c89" Feb 17 14:56:31 crc kubenswrapper[4717]: I0217 14:56:31.867567 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:31 crc kubenswrapper[4717]: I0217 14:56:31.867634 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" 
Feb 17 14:56:31 crc kubenswrapper[4717]: I0217 14:56:31.874247 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:33 crc kubenswrapper[4717]: I0217 14:56:33.737615 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:56:35 crc kubenswrapper[4717]: I0217 14:56:35.083994 4717 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:35 crc kubenswrapper[4717]: I0217 14:56:35.869518 4717 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="53cf81ac-301a-4420-8405-7dbde027690d" Feb 17 14:56:35 crc kubenswrapper[4717]: I0217 14:56:35.983482 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:35 crc kubenswrapper[4717]: I0217 14:56:35.983644 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f5c0095d-2b66-4009-a169-addd2dae6c89" Feb 17 14:56:35 crc kubenswrapper[4717]: I0217 14:56:35.983691 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f5c0095d-2b66-4009-a169-addd2dae6c89" Feb 17 14:56:35 crc kubenswrapper[4717]: I0217 14:56:35.989626 4717 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="53cf81ac-301a-4420-8405-7dbde027690d" Feb 17 14:56:36 crc kubenswrapper[4717]: I0217 14:56:36.991018 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f5c0095d-2b66-4009-a169-addd2dae6c89" Feb 17 14:56:36 crc kubenswrapper[4717]: I0217 
14:56:36.991068 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f5c0095d-2b66-4009-a169-addd2dae6c89" Feb 17 14:56:36 crc kubenswrapper[4717]: I0217 14:56:36.995288 4717 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="53cf81ac-301a-4420-8405-7dbde027690d" Feb 17 14:56:37 crc kubenswrapper[4717]: I0217 14:56:37.819455 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:56:37 crc kubenswrapper[4717]: I0217 14:56:37.827611 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:56:43 crc kubenswrapper[4717]: I0217 14:56:43.746023 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:56:44 crc kubenswrapper[4717]: I0217 14:56:44.340793 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 14:56:44 crc kubenswrapper[4717]: I0217 14:56:44.374319 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 14:56:44 crc kubenswrapper[4717]: I0217 14:56:44.954035 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 14:56:45 crc kubenswrapper[4717]: I0217 14:56:45.349736 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 14:56:45 crc kubenswrapper[4717]: I0217 14:56:45.478697 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 14:56:45 crc 
kubenswrapper[4717]: I0217 14:56:45.681172 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 14:56:45 crc kubenswrapper[4717]: I0217 14:56:45.856843 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 14:56:46 crc kubenswrapper[4717]: I0217 14:56:46.013400 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 14:56:46 crc kubenswrapper[4717]: I0217 14:56:46.059928 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 14:56:46 crc kubenswrapper[4717]: I0217 14:56:46.061404 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 14:56:46 crc kubenswrapper[4717]: I0217 14:56:46.215750 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 14:56:46 crc kubenswrapper[4717]: I0217 14:56:46.515762 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 14:56:46 crc kubenswrapper[4717]: I0217 14:56:46.586536 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 14:56:46 crc kubenswrapper[4717]: I0217 14:56:46.686736 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 14:56:47 crc kubenswrapper[4717]: I0217 14:56:47.067682 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 14:56:47 crc kubenswrapper[4717]: I0217 14:56:47.120847 4717 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 14:56:47 crc kubenswrapper[4717]: I0217 14:56:47.346349 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 14:56:47 crc kubenswrapper[4717]: I0217 14:56:47.568540 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 14:56:47 crc kubenswrapper[4717]: I0217 14:56:47.574550 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 14:56:47 crc kubenswrapper[4717]: I0217 14:56:47.919839 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 14:56:47 crc kubenswrapper[4717]: I0217 14:56:47.956142 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 14:56:48 crc kubenswrapper[4717]: I0217 14:56:48.138921 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 14:56:48 crc kubenswrapper[4717]: I0217 14:56:48.158927 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 14:56:48 crc kubenswrapper[4717]: I0217 14:56:48.353471 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 14:56:48 crc kubenswrapper[4717]: I0217 14:56:48.357562 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 14:56:48 crc kubenswrapper[4717]: I0217 14:56:48.470351 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 14:56:48 crc kubenswrapper[4717]: I0217 14:56:48.522488 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 14:56:48 crc kubenswrapper[4717]: I0217 14:56:48.617497 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 14:56:48 crc kubenswrapper[4717]: I0217 14:56:48.837434 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 14:56:48 crc kubenswrapper[4717]: I0217 14:56:48.912063 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.014903 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.046664 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.078131 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.088484 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.233704 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.270298 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.350009 4717 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.411175 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.417751 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.439168 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.442207 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.537581 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.592797 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.601898 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.640887 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.792235 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.889017 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 14:56:49 crc kubenswrapper[4717]: 
I0217 14:56:49.925015 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.958015 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.993872 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 14:56:49 crc kubenswrapper[4717]: I0217 14:56:49.998515 4717 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.006187 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-6lpz6"] Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.006358 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.011897 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.033192 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.033176518 podStartE2EDuration="15.033176518s" podCreationTimestamp="2026-02-17 14:56:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:56:50.032278791 +0000 UTC m=+276.448119287" watchObservedRunningTime="2026-02-17 14:56:50.033176518 +0000 UTC m=+276.449016994" Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.098993 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.357352 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.358129 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.360709 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.384463 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.457423 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.482693 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.509852 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.643402 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 14:56:50 crc kubenswrapper[4717]: I0217 14:56:50.761253 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:50.827633 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:50.922812 4717 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:50.987295 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.004724 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.029507 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.029867 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.037893 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.072759 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.075148 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.097535 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.108945 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.127433 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 14:56:51 crc 
kubenswrapper[4717]: I0217 14:56:51.138692 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.191756 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.215036 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.360052 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.413478 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.453168 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.457590 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.577035 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.676404 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.744325 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.778603 4717 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.781322 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.873411 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.882222 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee77ab9-8247-4813-8137-d627a31c8840" path="/var/lib/kubelet/pods/eee77ab9-8247-4813-8137-d627a31c8840/volumes" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.907651 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 14:56:51 crc kubenswrapper[4717]: I0217 14:56:51.952802 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.183316 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.329413 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.377311 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.400741 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.433023 4717 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.629452 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.630576 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.675925 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.704353 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.748602 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.751866 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.829253 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.835703 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.848370 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.942580 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 
14:56:52 crc kubenswrapper[4717]: I0217 14:56:52.969155 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.012200 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.057411 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.081360 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.095235 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.098728 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.300757 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.306622 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.347861 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.383398 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.394165 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.447330 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.471392 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.486916 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.502826 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.505226 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.508030 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.572824 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.610821 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.620414 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.636416 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.674269 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.815889 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.914900 4717 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podeee77ab9-8247-4813-8137-d627a31c8840"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podeee77ab9-8247-4813-8137-d627a31c8840] : Timed out while waiting for systemd to remove kubepods-burstable-podeee77ab9_8247_4813_8137_d627a31c8840.slice"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.938416 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.952838 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 17 14:56:53 crc kubenswrapper[4717]: I0217 14:56:53.982846 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.017713 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.033996 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.072976 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.099910 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.105695 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.114703 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.310815 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.367494 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.385462 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.486115 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.564398 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.589212 4717 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.639814 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.653129 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.724245 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.736466 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 17 14:56:54 crc kubenswrapper[4717]: I0217 14:56:54.930126 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.068468 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.071243 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"]
Feb 17 14:56:55 crc kubenswrapper[4717]: E0217 14:56:55.071627 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee77ab9-8247-4813-8137-d627a31c8840" containerName="oauth-openshift"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.071674 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee77ab9-8247-4813-8137-d627a31c8840" containerName="oauth-openshift"
Feb 17 14:56:55 crc kubenswrapper[4717]: E0217 14:56:55.071738 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" containerName="installer"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.071757 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" containerName="installer"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.071973 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee77ab9-8247-4813-8137-d627a31c8840" containerName="oauth-openshift"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.072021 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3f83fb-6861-461b-a7de-5968ce2089fd" containerName="installer"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.073023 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.077421 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.077722 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.077868 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.078589 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.078948 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.079118 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.079267 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.079856 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.081634 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.083252 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.083289 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.085415 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.097327 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.098923 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.100371 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.105437 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.141711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsbw5\" (UniqueName: \"kubernetes.io/projected/129543eb-8d64-4b8b-9dec-ca1be2345b4f-kube-api-access-wsbw5\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.141786 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.141839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-session\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.141871 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.141909 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-template-login\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.142160 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/129543eb-8d64-4b8b-9dec-ca1be2345b4f-audit-dir\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.142243 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-template-error\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.142329 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-audit-policies\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.142423 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.142485 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.142577 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.142617 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.142653 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.142685 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.243757 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.243831 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.243855 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.243880 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.243913 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsbw5\" (UniqueName: \"kubernetes.io/projected/129543eb-8d64-4b8b-9dec-ca1be2345b4f-kube-api-access-wsbw5\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.243938 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.243962 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-session\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.243980 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.244005 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-template-login\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.244038 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/129543eb-8d64-4b8b-9dec-ca1be2345b4f-audit-dir\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.244064 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-template-error\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.244109 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-audit-policies\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.244146 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.244169 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.244455 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/129543eb-8d64-4b8b-9dec-ca1be2345b4f-audit-dir\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.245393 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.245398 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-audit-policies\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.245968 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.245969 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.252786 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.252818 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-template-error\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.253289 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-template-login\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.253312 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.254511 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.254920 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-session\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.256453 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.257562 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/129543eb-8d64-4b8b-9dec-ca1be2345b4f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.263902 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsbw5\" (UniqueName: \"kubernetes.io/projected/129543eb-8d64-4b8b-9dec-ca1be2345b4f-kube-api-access-wsbw5\") pod \"oauth-openshift-7fb5d9b995-9brwh\" (UID: \"129543eb-8d64-4b8b-9dec-ca1be2345b4f\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.279549 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.344497 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.379033 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.386708 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.392439 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.398063 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.518360 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.541301 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.592385 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.647365 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.648201 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.798965 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.854057 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.864264 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.886704 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.946124 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 17 14:56:55 crc kubenswrapper[4717]: I0217 14:56:55.981852 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.015447 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.029330 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.085125 4717 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.090348 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.134819 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.278510 4717 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.378759 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.394416 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.407047 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.416630 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.426725 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.433743 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.464852 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.490677 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.554990 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.615163 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.629870 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.638494 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.703483 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.707151 4717 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.773439 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 14:56:56 crc kubenswrapper[4717]: I0217 14:56:56.920007 4717 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.084060 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.098187 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.148329 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.148830 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.168505 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.212711 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.258426 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.344761 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.361307 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"kube-root-ca.crt" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.367952 4717 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.642016 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.751141 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.829610 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.834709 4717 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.835145 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://bade48f91e1f7170476b32ad9dc36444531f6397bb27f763413016f5c2bb217c" gracePeriod=5 Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.858130 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.899579 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.958577 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 14:56:57 crc kubenswrapper[4717]: I0217 14:56:57.978948 4717 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.000233 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.249148 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.289203 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"] Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.477168 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7fb5d9b995-9brwh"] Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.542684 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.604146 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.604874 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.788415 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.792144 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.836035 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.908063 4717 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 14:56:58 crc kubenswrapper[4717]: I0217 14:56:58.948869 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.079436 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.113476 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.162335 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh" event={"ID":"129543eb-8d64-4b8b-9dec-ca1be2345b4f","Type":"ContainerStarted","Data":"75aef448b6cdf5b68f842ff002f975f140e28071a7ac48c8370d3ef55b6e5519"} Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.162772 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.162804 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh" event={"ID":"129543eb-8d64-4b8b-9dec-ca1be2345b4f","Type":"ContainerStarted","Data":"4780ca4cb719d992562825f1fc13cce2864f9fbc759b60540e8e49b455f0f7aa"} Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.188350 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.191444 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh" podStartSLOduration=61.191418114 podStartE2EDuration="1m1.191418114s" 
podCreationTimestamp="2026-02-17 14:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:56:59.188221609 +0000 UTC m=+285.604062125" watchObservedRunningTime="2026-02-17 14:56:59.191418114 +0000 UTC m=+285.607258620" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.341105 4717 patch_prober.go:28] interesting pod/oauth-openshift-7fb5d9b995-9brwh container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:46358->10.217.0.56:6443: read: connection reset by peer" start-of-body= Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.341192 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh" podUID="129543eb-8d64-4b8b-9dec-ca1be2345b4f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:46358->10.217.0.56:6443: read: connection reset by peer" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.370945 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.400111 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.416626 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.428958 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.434477 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.586066 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.588398 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.591245 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 14:56:59 crc kubenswrapper[4717]: I0217 14:56:59.927403 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.084775 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.136730 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.136917 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.175488 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.178143 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.178461 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication_oauth-openshift-7fb5d9b995-9brwh_129543eb-8d64-4b8b-9dec-ca1be2345b4f/oauth-openshift/0.log" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.178516 4717 generic.go:334] "Generic (PLEG): container finished" podID="129543eb-8d64-4b8b-9dec-ca1be2345b4f" containerID="75aef448b6cdf5b68f842ff002f975f140e28071a7ac48c8370d3ef55b6e5519" exitCode=255 Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.178576 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh" event={"ID":"129543eb-8d64-4b8b-9dec-ca1be2345b4f","Type":"ContainerDied","Data":"75aef448b6cdf5b68f842ff002f975f140e28071a7ac48c8370d3ef55b6e5519"} Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.179398 4717 scope.go:117] "RemoveContainer" containerID="75aef448b6cdf5b68f842ff002f975f140e28071a7ac48c8370d3ef55b6e5519" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.469273 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.641044 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.649350 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.716210 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 14:57:00 crc kubenswrapper[4717]: I0217 14:57:00.956639 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 14:57:01 crc kubenswrapper[4717]: I0217 14:57:01.174896 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 14:57:01 crc kubenswrapper[4717]: I0217 14:57:01.188168 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-7fb5d9b995-9brwh_129543eb-8d64-4b8b-9dec-ca1be2345b4f/oauth-openshift/0.log" Feb 17 14:57:01 crc kubenswrapper[4717]: I0217 14:57:01.188254 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh" event={"ID":"129543eb-8d64-4b8b-9dec-ca1be2345b4f","Type":"ContainerStarted","Data":"27a387dfa1af5150d355426a6739513d9ec52eb3aa0536850251766bed9888e1"} Feb 17 14:57:01 crc kubenswrapper[4717]: I0217 14:57:01.188710 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh" Feb 17 14:57:01 crc kubenswrapper[4717]: I0217 14:57:01.194816 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7fb5d9b995-9brwh" Feb 17 14:57:02 crc kubenswrapper[4717]: I0217 14:57:02.152186 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.216211 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.216577 4717 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="bade48f91e1f7170476b32ad9dc36444531f6397bb27f763413016f5c2bb217c" exitCode=137 Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.444152 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.540890 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.541051 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.674948 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.675124 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.675173 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.675153 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.675313 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.675322 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.675353 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.675353 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.675385 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.676193 4717 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.676241 4717 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.676266 4717 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.676288 4717 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.690441 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.776990 4717 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:57:03 crc kubenswrapper[4717]: I0217 14:57:03.856341 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 17 14:57:04 crc kubenswrapper[4717]: I0217 14:57:04.227309 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 14:57:04 crc kubenswrapper[4717]: I0217 14:57:04.227457 4717 scope.go:117] "RemoveContainer" containerID="bade48f91e1f7170476b32ad9dc36444531f6397bb27f763413016f5c2bb217c" Feb 17 14:57:04 crc kubenswrapper[4717]: I0217 14:57:04.227544 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:57:15 crc kubenswrapper[4717]: I0217 14:57:15.375358 4717 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 17 14:57:18 crc kubenswrapper[4717]: I0217 14:57:18.314046 4717 generic.go:334] "Generic (PLEG): container finished" podID="2fbfa975-4e24-4213-8368-ba1af6b39e21" containerID="6dca40a0f9057be15062a06f09179c08d0369bc7da7a166c9c75fa49c718c09a" exitCode=0 Feb 17 14:57:18 crc kubenswrapper[4717]: I0217 14:57:18.314185 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" event={"ID":"2fbfa975-4e24-4213-8368-ba1af6b39e21","Type":"ContainerDied","Data":"6dca40a0f9057be15062a06f09179c08d0369bc7da7a166c9c75fa49c718c09a"} Feb 17 14:57:18 crc kubenswrapper[4717]: I0217 14:57:18.315701 4717 scope.go:117] "RemoveContainer" containerID="6dca40a0f9057be15062a06f09179c08d0369bc7da7a166c9c75fa49c718c09a" Feb 17 14:57:19 crc kubenswrapper[4717]: I0217 14:57:19.321770 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" event={"ID":"2fbfa975-4e24-4213-8368-ba1af6b39e21","Type":"ContainerStarted","Data":"316e0e26038483307357c997ce468ecf66894792a94a9d08310f6b194d6ba6db"} Feb 17 14:57:19 crc kubenswrapper[4717]: I0217 14:57:19.322126 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:57:19 crc kubenswrapper[4717]: I0217 14:57:19.323526 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:58:04 crc kubenswrapper[4717]: I0217 14:58:04.366030 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nfj6t"] Feb 17 14:58:04 crc 
kubenswrapper[4717]: I0217 14:58:04.366847 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" podUID="45c674d2-b259-4f0d-a92f-5a906aa07392" containerName="controller-manager" containerID="cri-o://e0991f3cae46b3aab025eb65a4752385a7cef4b57ce19e0b8f93d06e86edde98" gracePeriod=30 Feb 17 14:58:04 crc kubenswrapper[4717]: I0217 14:58:04.498248 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh"] Feb 17 14:58:04 crc kubenswrapper[4717]: I0217 14:58:04.498546 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" podUID="35e2909f-4827-4ad7-a8f2-b464f982967f" containerName="route-controller-manager" containerID="cri-o://27eeee204c2feff5387bb12477bccdf978d6ea7fcd7d38221dbfe4cff76fc4d1" gracePeriod=30 Feb 17 14:58:04 crc kubenswrapper[4717]: I0217 14:58:04.623323 4717 generic.go:334] "Generic (PLEG): container finished" podID="45c674d2-b259-4f0d-a92f-5a906aa07392" containerID="e0991f3cae46b3aab025eb65a4752385a7cef4b57ce19e0b8f93d06e86edde98" exitCode=0 Feb 17 14:58:04 crc kubenswrapper[4717]: I0217 14:58:04.623410 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" event={"ID":"45c674d2-b259-4f0d-a92f-5a906aa07392","Type":"ContainerDied","Data":"e0991f3cae46b3aab025eb65a4752385a7cef4b57ce19e0b8f93d06e86edde98"} Feb 17 14:58:04 crc kubenswrapper[4717]: I0217 14:58:04.626200 4717 generic.go:334] "Generic (PLEG): container finished" podID="35e2909f-4827-4ad7-a8f2-b464f982967f" containerID="27eeee204c2feff5387bb12477bccdf978d6ea7fcd7d38221dbfe4cff76fc4d1" exitCode=0 Feb 17 14:58:04 crc kubenswrapper[4717]: I0217 14:58:04.626247 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" event={"ID":"35e2909f-4827-4ad7-a8f2-b464f982967f","Type":"ContainerDied","Data":"27eeee204c2feff5387bb12477bccdf978d6ea7fcd7d38221dbfe4cff76fc4d1"} Feb 17 14:58:04 crc kubenswrapper[4717]: I0217 14:58:04.955236 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.009332 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.153972 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-client-ca\") pod \"35e2909f-4827-4ad7-a8f2-b464f982967f\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.154032 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-proxy-ca-bundles\") pod \"45c674d2-b259-4f0d-a92f-5a906aa07392\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.154108 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll6d5\" (UniqueName: \"kubernetes.io/projected/45c674d2-b259-4f0d-a92f-5a906aa07392-kube-api-access-ll6d5\") pod \"45c674d2-b259-4f0d-a92f-5a906aa07392\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.154173 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-config\") pod 
\"35e2909f-4827-4ad7-a8f2-b464f982967f\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.154200 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35e2909f-4827-4ad7-a8f2-b464f982967f-serving-cert\") pod \"35e2909f-4827-4ad7-a8f2-b464f982967f\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.154236 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-client-ca\") pod \"45c674d2-b259-4f0d-a92f-5a906aa07392\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.154276 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45c674d2-b259-4f0d-a92f-5a906aa07392-serving-cert\") pod \"45c674d2-b259-4f0d-a92f-5a906aa07392\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.154333 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxqdd\" (UniqueName: \"kubernetes.io/projected/35e2909f-4827-4ad7-a8f2-b464f982967f-kube-api-access-cxqdd\") pod \"35e2909f-4827-4ad7-a8f2-b464f982967f\" (UID: \"35e2909f-4827-4ad7-a8f2-b464f982967f\") " Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.154362 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-config\") pod \"45c674d2-b259-4f0d-a92f-5a906aa07392\" (UID: \"45c674d2-b259-4f0d-a92f-5a906aa07392\") " Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.155009 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-client-ca" (OuterVolumeSpecName: "client-ca") pod "35e2909f-4827-4ad7-a8f2-b464f982967f" (UID: "35e2909f-4827-4ad7-a8f2-b464f982967f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.155404 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-config" (OuterVolumeSpecName: "config") pod "45c674d2-b259-4f0d-a92f-5a906aa07392" (UID: "45c674d2-b259-4f0d-a92f-5a906aa07392"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.155435 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-client-ca" (OuterVolumeSpecName: "client-ca") pod "45c674d2-b259-4f0d-a92f-5a906aa07392" (UID: "45c674d2-b259-4f0d-a92f-5a906aa07392"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.155749 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "45c674d2-b259-4f0d-a92f-5a906aa07392" (UID: "45c674d2-b259-4f0d-a92f-5a906aa07392"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.155898 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-config" (OuterVolumeSpecName: "config") pod "35e2909f-4827-4ad7-a8f2-b464f982967f" (UID: "35e2909f-4827-4ad7-a8f2-b464f982967f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.161860 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c674d2-b259-4f0d-a92f-5a906aa07392-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "45c674d2-b259-4f0d-a92f-5a906aa07392" (UID: "45c674d2-b259-4f0d-a92f-5a906aa07392"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.165544 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e2909f-4827-4ad7-a8f2-b464f982967f-kube-api-access-cxqdd" (OuterVolumeSpecName: "kube-api-access-cxqdd") pod "35e2909f-4827-4ad7-a8f2-b464f982967f" (UID: "35e2909f-4827-4ad7-a8f2-b464f982967f"). InnerVolumeSpecName "kube-api-access-cxqdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.165585 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e2909f-4827-4ad7-a8f2-b464f982967f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "35e2909f-4827-4ad7-a8f2-b464f982967f" (UID: "35e2909f-4827-4ad7-a8f2-b464f982967f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.167410 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c674d2-b259-4f0d-a92f-5a906aa07392-kube-api-access-ll6d5" (OuterVolumeSpecName: "kube-api-access-ll6d5") pod "45c674d2-b259-4f0d-a92f-5a906aa07392" (UID: "45c674d2-b259-4f0d-a92f-5a906aa07392"). InnerVolumeSpecName "kube-api-access-ll6d5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.240811 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8549455d47-2jl9d"] Feb 17 14:58:05 crc kubenswrapper[4717]: E0217 14:58:05.241108 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c674d2-b259-4f0d-a92f-5a906aa07392" containerName="controller-manager" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.241123 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c674d2-b259-4f0d-a92f-5a906aa07392" containerName="controller-manager" Feb 17 14:58:05 crc kubenswrapper[4717]: E0217 14:58:05.241138 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e2909f-4827-4ad7-a8f2-b464f982967f" containerName="route-controller-manager" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.241145 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e2909f-4827-4ad7-a8f2-b464f982967f" containerName="route-controller-manager" Feb 17 14:58:05 crc kubenswrapper[4717]: E0217 14:58:05.241156 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.241164 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.241453 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e2909f-4827-4ad7-a8f2-b464f982967f" containerName="route-controller-manager" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.241469 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.241478 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="45c674d2-b259-4f0d-a92f-5a906aa07392" containerName="controller-manager" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.241911 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.255783 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-config\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.255882 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-client-ca\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.255922 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmhkh\" (UniqueName: \"kubernetes.io/projected/cce66dd9-aa57-4f43-b985-fa8ebf685e14-kube-api-access-bmhkh\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.255964 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce66dd9-aa57-4f43-b985-fa8ebf685e14-serving-cert\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " 
pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.255983 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-proxy-ca-bundles\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.256025 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxqdd\" (UniqueName: \"kubernetes.io/projected/35e2909f-4827-4ad7-a8f2-b464f982967f-kube-api-access-cxqdd\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.256037 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.256048 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.256058 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.256069 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll6d5\" (UniqueName: \"kubernetes.io/projected/45c674d2-b259-4f0d-a92f-5a906aa07392-kube-api-access-ll6d5\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.256091 4717 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/35e2909f-4827-4ad7-a8f2-b464f982967f-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.256100 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35e2909f-4827-4ad7-a8f2-b464f982967f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.256109 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45c674d2-b259-4f0d-a92f-5a906aa07392-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.256117 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45c674d2-b259-4f0d-a92f-5a906aa07392-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.259862 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8549455d47-2jl9d"] Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.319869 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj"] Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.320791 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.335269 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj"] Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.356826 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-client-ca\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.356872 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-client-ca\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.356897 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmhkh\" (UniqueName: \"kubernetes.io/projected/cce66dd9-aa57-4f43-b985-fa8ebf685e14-kube-api-access-bmhkh\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.356917 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/431dbb93-53b1-4c14-9c62-9371f3eec0f8-serving-cert\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " 
pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.356934 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgk7h\" (UniqueName: \"kubernetes.io/projected/431dbb93-53b1-4c14-9c62-9371f3eec0f8-kube-api-access-jgk7h\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.356953 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-config\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.356979 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce66dd9-aa57-4f43-b985-fa8ebf685e14-serving-cert\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.356997 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-proxy-ca-bundles\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.357039 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-config\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.358211 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-client-ca\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.358348 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-config\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.358665 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-proxy-ca-bundles\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.364051 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce66dd9-aa57-4f43-b985-fa8ebf685e14-serving-cert\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.379153 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bmhkh\" (UniqueName: \"kubernetes.io/projected/cce66dd9-aa57-4f43-b985-fa8ebf685e14-kube-api-access-bmhkh\") pod \"controller-manager-8549455d47-2jl9d\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.458282 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/431dbb93-53b1-4c14-9c62-9371f3eec0f8-serving-cert\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.458350 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgk7h\" (UniqueName: \"kubernetes.io/projected/431dbb93-53b1-4c14-9c62-9371f3eec0f8-kube-api-access-jgk7h\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.458388 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-config\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.458485 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-client-ca\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " 
pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.459637 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-client-ca\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.461847 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/431dbb93-53b1-4c14-9c62-9371f3eec0f8-serving-cert\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.462175 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-config\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.475886 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgk7h\" (UniqueName: \"kubernetes.io/projected/431dbb93-53b1-4c14-9c62-9371f3eec0f8-kube-api-access-jgk7h\") pod \"route-controller-manager-b7bf6dcb7-kz5mj\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.555876 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.635472 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" event={"ID":"45c674d2-b259-4f0d-a92f-5a906aa07392","Type":"ContainerDied","Data":"6da73034eafcd6e21c5a32224b35d2b47c1187f840ebb4ac7596220c38865ee1"} Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.635493 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.635522 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nfj6t" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.635536 4717 scope.go:117] "RemoveContainer" containerID="e0991f3cae46b3aab025eb65a4752385a7cef4b57ce19e0b8f93d06e86edde98" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.638381 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" event={"ID":"35e2909f-4827-4ad7-a8f2-b464f982967f","Type":"ContainerDied","Data":"938672057d00dff7db8c5111a36df1c25beba6288838ff74fb86d77cc988153a"} Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.638714 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.658303 4717 scope.go:117] "RemoveContainer" containerID="27eeee204c2feff5387bb12477bccdf978d6ea7fcd7d38221dbfe4cff76fc4d1" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.698549 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nfj6t"] Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.703434 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nfj6t"] Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.728987 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh"] Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.732668 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s8hhh"] Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.821604 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8549455d47-2jl9d"] Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.853587 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e2909f-4827-4ad7-a8f2-b464f982967f" path="/var/lib/kubelet/pods/35e2909f-4827-4ad7-a8f2-b464f982967f/volumes" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.854181 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45c674d2-b259-4f0d-a92f-5a906aa07392" path="/var/lib/kubelet/pods/45c674d2-b259-4f0d-a92f-5a906aa07392/volumes" Feb 17 14:58:05 crc kubenswrapper[4717]: I0217 14:58:05.916012 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj"] Feb 17 14:58:05 crc 
kubenswrapper[4717]: W0217 14:58:05.938997 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod431dbb93_53b1_4c14_9c62_9371f3eec0f8.slice/crio-f56fb88d3cde37f8908fed62e3e3c005838612d2ef0ba8ea3983498dc99abd41 WatchSource:0}: Error finding container f56fb88d3cde37f8908fed62e3e3c005838612d2ef0ba8ea3983498dc99abd41: Status 404 returned error can't find the container with id f56fb88d3cde37f8908fed62e3e3c005838612d2ef0ba8ea3983498dc99abd41 Feb 17 14:58:06 crc kubenswrapper[4717]: I0217 14:58:06.645757 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" event={"ID":"431dbb93-53b1-4c14-9c62-9371f3eec0f8","Type":"ContainerStarted","Data":"879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b"} Feb 17 14:58:06 crc kubenswrapper[4717]: I0217 14:58:06.646194 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:06 crc kubenswrapper[4717]: I0217 14:58:06.646216 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" event={"ID":"431dbb93-53b1-4c14-9c62-9371f3eec0f8","Type":"ContainerStarted","Data":"f56fb88d3cde37f8908fed62e3e3c005838612d2ef0ba8ea3983498dc99abd41"} Feb 17 14:58:06 crc kubenswrapper[4717]: I0217 14:58:06.649258 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" event={"ID":"cce66dd9-aa57-4f43-b985-fa8ebf685e14","Type":"ContainerStarted","Data":"a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30"} Feb 17 14:58:06 crc kubenswrapper[4717]: I0217 14:58:06.649296 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" 
event={"ID":"cce66dd9-aa57-4f43-b985-fa8ebf685e14","Type":"ContainerStarted","Data":"53a97555c10f2b3c37b088cdf955c19736c0c63447bf8662c7a7b4560fc53dcb"} Feb 17 14:58:06 crc kubenswrapper[4717]: I0217 14:58:06.649462 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:06 crc kubenswrapper[4717]: I0217 14:58:06.652347 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:06 crc kubenswrapper[4717]: I0217 14:58:06.655569 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:06 crc kubenswrapper[4717]: I0217 14:58:06.665219 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" podStartSLOduration=1.665197333 podStartE2EDuration="1.665197333s" podCreationTimestamp="2026-02-17 14:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:58:06.663406556 +0000 UTC m=+353.079247032" watchObservedRunningTime="2026-02-17 14:58:06.665197333 +0000 UTC m=+353.081037809" Feb 17 14:58:06 crc kubenswrapper[4717]: I0217 14:58:06.684278 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" podStartSLOduration=1.684253212 podStartE2EDuration="1.684253212s" podCreationTimestamp="2026-02-17 14:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:58:06.679142238 +0000 UTC m=+353.094982724" watchObservedRunningTime="2026-02-17 14:58:06.684253212 +0000 UTC m=+353.100093698" Feb 17 14:58:17 
crc kubenswrapper[4717]: I0217 14:58:17.799529 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8549455d47-2jl9d"] Feb 17 14:58:17 crc kubenswrapper[4717]: I0217 14:58:17.800654 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" podUID="cce66dd9-aa57-4f43-b985-fa8ebf685e14" containerName="controller-manager" containerID="cri-o://a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30" gracePeriod=30 Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.504775 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.657549 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-config\") pod \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.657656 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmhkh\" (UniqueName: \"kubernetes.io/projected/cce66dd9-aa57-4f43-b985-fa8ebf685e14-kube-api-access-bmhkh\") pod \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.657790 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-proxy-ca-bundles\") pod \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.657865 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-client-ca\") pod \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.657898 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce66dd9-aa57-4f43-b985-fa8ebf685e14-serving-cert\") pod \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\" (UID: \"cce66dd9-aa57-4f43-b985-fa8ebf685e14\") " Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.658972 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-config" (OuterVolumeSpecName: "config") pod "cce66dd9-aa57-4f43-b985-fa8ebf685e14" (UID: "cce66dd9-aa57-4f43-b985-fa8ebf685e14"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.659038 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-client-ca" (OuterVolumeSpecName: "client-ca") pod "cce66dd9-aa57-4f43-b985-fa8ebf685e14" (UID: "cce66dd9-aa57-4f43-b985-fa8ebf685e14"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.659028 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cce66dd9-aa57-4f43-b985-fa8ebf685e14" (UID: "cce66dd9-aa57-4f43-b985-fa8ebf685e14"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.665279 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce66dd9-aa57-4f43-b985-fa8ebf685e14-kube-api-access-bmhkh" (OuterVolumeSpecName: "kube-api-access-bmhkh") pod "cce66dd9-aa57-4f43-b985-fa8ebf685e14" (UID: "cce66dd9-aa57-4f43-b985-fa8ebf685e14"). InnerVolumeSpecName "kube-api-access-bmhkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.667239 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce66dd9-aa57-4f43-b985-fa8ebf685e14-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cce66dd9-aa57-4f43-b985-fa8ebf685e14" (UID: "cce66dd9-aa57-4f43-b985-fa8ebf685e14"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.756528 4717 generic.go:334] "Generic (PLEG): container finished" podID="cce66dd9-aa57-4f43-b985-fa8ebf685e14" containerID="a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30" exitCode=0 Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.756614 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" event={"ID":"cce66dd9-aa57-4f43-b985-fa8ebf685e14","Type":"ContainerDied","Data":"a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30"} Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.756632 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.756676 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8549455d47-2jl9d" event={"ID":"cce66dd9-aa57-4f43-b985-fa8ebf685e14","Type":"ContainerDied","Data":"53a97555c10f2b3c37b088cdf955c19736c0c63447bf8662c7a7b4560fc53dcb"} Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.756706 4717 scope.go:117] "RemoveContainer" containerID="a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.759351 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.759406 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.759418 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce66dd9-aa57-4f43-b985-fa8ebf685e14-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.759433 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce66dd9-aa57-4f43-b985-fa8ebf685e14-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.759447 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmhkh\" (UniqueName: \"kubernetes.io/projected/cce66dd9-aa57-4f43-b985-fa8ebf685e14-kube-api-access-bmhkh\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 
14:58:18.776218 4717 scope.go:117] "RemoveContainer" containerID="a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30" Feb 17 14:58:18 crc kubenswrapper[4717]: E0217 14:58:18.777674 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30\": container with ID starting with a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30 not found: ID does not exist" containerID="a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.777727 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30"} err="failed to get container status \"a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30\": rpc error: code = NotFound desc = could not find container \"a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30\": container with ID starting with a47a68815f265c14e7883095bb6be3d4c25b5722842328ec708ba5fd54251c30 not found: ID does not exist" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.791225 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8549455d47-2jl9d"] Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.795420 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8549455d47-2jl9d"] Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.833792 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr"] Feb 17 14:58:18 crc kubenswrapper[4717]: E0217 14:58:18.834075 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce66dd9-aa57-4f43-b985-fa8ebf685e14" containerName="controller-manager" Feb 17 14:58:18 crc 
kubenswrapper[4717]: I0217 14:58:18.834110 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce66dd9-aa57-4f43-b985-fa8ebf685e14" containerName="controller-manager" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.834252 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce66dd9-aa57-4f43-b985-fa8ebf685e14" containerName="controller-manager" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.834741 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.837656 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.837944 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.839179 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.840350 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.840668 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.842188 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.848116 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr"] Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.850854 4717 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.962878 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0b227e9-a36a-4c0b-8411-251ed91019f0-client-ca\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.962987 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0b227e9-a36a-4c0b-8411-251ed91019f0-proxy-ca-bundles\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.963040 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmqfv\" (UniqueName: \"kubernetes.io/projected/e0b227e9-a36a-4c0b-8411-251ed91019f0-kube-api-access-mmqfv\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.963063 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b227e9-a36a-4c0b-8411-251ed91019f0-serving-cert\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:18 crc kubenswrapper[4717]: I0217 14:58:18.963098 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e0b227e9-a36a-4c0b-8411-251ed91019f0-config\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.064971 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0b227e9-a36a-4c0b-8411-251ed91019f0-client-ca\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.065121 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0b227e9-a36a-4c0b-8411-251ed91019f0-proxy-ca-bundles\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.065171 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmqfv\" (UniqueName: \"kubernetes.io/projected/e0b227e9-a36a-4c0b-8411-251ed91019f0-kube-api-access-mmqfv\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.065198 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b227e9-a36a-4c0b-8411-251ed91019f0-serving-cert\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 
14:58:19.065225 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0b227e9-a36a-4c0b-8411-251ed91019f0-config\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.066351 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0b227e9-a36a-4c0b-8411-251ed91019f0-client-ca\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.066992 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0b227e9-a36a-4c0b-8411-251ed91019f0-config\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.067132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0b227e9-a36a-4c0b-8411-251ed91019f0-proxy-ca-bundles\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.075535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b227e9-a36a-4c0b-8411-251ed91019f0-serving-cert\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc 
kubenswrapper[4717]: I0217 14:58:19.088413 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmqfv\" (UniqueName: \"kubernetes.io/projected/e0b227e9-a36a-4c0b-8411-251ed91019f0-kube-api-access-mmqfv\") pod \"controller-manager-fccdd5f7d-tvhvr\" (UID: \"e0b227e9-a36a-4c0b-8411-251ed91019f0\") " pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.159736 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.402809 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr"] Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.770743 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" event={"ID":"e0b227e9-a36a-4c0b-8411-251ed91019f0","Type":"ContainerStarted","Data":"56512fc3571675c25ee905fe53cc5d49640fc46a887223efe131656fc7e39e01"} Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.771192 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" event={"ID":"e0b227e9-a36a-4c0b-8411-251ed91019f0","Type":"ContainerStarted","Data":"f71e79402a9470ec7068d7a7347c5b77f933f39132df0b94ff0291b0d9aba79e"} Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.771341 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.777918 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.806243 4717 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fccdd5f7d-tvhvr" podStartSLOduration=2.8062049719999997 podStartE2EDuration="2.806204972s" podCreationTimestamp="2026-02-17 14:58:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:58:19.80576535 +0000 UTC m=+366.221605836" watchObservedRunningTime="2026-02-17 14:58:19.806204972 +0000 UTC m=+366.222045448" Feb 17 14:58:19 crc kubenswrapper[4717]: I0217 14:58:19.855101 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce66dd9-aa57-4f43-b985-fa8ebf685e14" path="/var/lib/kubelet/pods/cce66dd9-aa57-4f43-b985-fa8ebf685e14/volumes" Feb 17 14:58:20 crc kubenswrapper[4717]: I0217 14:58:20.809616 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:58:20 crc kubenswrapper[4717]: I0217 14:58:20.809725 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.302827 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wkkwn"] Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.305066 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.319817 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wkkwn"] Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.440221 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.440287 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-registry-certificates\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.440317 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-registry-tls\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.440350 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.440478 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.440535 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-bound-sa-token\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.440568 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-trusted-ca\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.440596 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfl29\" (UniqueName: \"kubernetes.io/projected/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-kube-api-access-tfl29\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.483887 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.541750 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-registry-tls\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.541826 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.541851 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-bound-sa-token\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.541875 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-trusted-ca\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.541901 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tfl29\" (UniqueName: \"kubernetes.io/projected/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-kube-api-access-tfl29\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.541940 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.541975 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-registry-certificates\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.542548 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.543291 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-registry-certificates\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.543803 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-trusted-ca\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.550021 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.554759 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-registry-tls\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.559851 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfl29\" (UniqueName: \"kubernetes.io/projected/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-kube-api-access-tfl29\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: \"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.563182 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5-bound-sa-token\") pod \"image-registry-66df7c8f76-wkkwn\" (UID: 
\"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5\") " pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:36 crc kubenswrapper[4717]: I0217 14:58:36.628670 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:37 crc kubenswrapper[4717]: I0217 14:58:37.063845 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wkkwn"] Feb 17 14:58:37 crc kubenswrapper[4717]: I0217 14:58:37.126232 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" event={"ID":"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5","Type":"ContainerStarted","Data":"d53298ab1574ab5c4c3bfe61270e42cd23b381fd74438d2ec29910fa577d9a01"} Feb 17 14:58:38 crc kubenswrapper[4717]: I0217 14:58:38.177821 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" event={"ID":"3be7d522-2aa2-4e49-ae55-ec9e6a2ab9e5","Type":"ContainerStarted","Data":"b98b2c341a37c2e4b97b109c4f85ee562917db59ad55fbebcdf85496408c1868"} Feb 17 14:58:38 crc kubenswrapper[4717]: I0217 14:58:38.180604 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:38 crc kubenswrapper[4717]: I0217 14:58:38.202969 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" podStartSLOduration=2.202934425 podStartE2EDuration="2.202934425s" podCreationTimestamp="2026-02-17 14:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:58:38.199478405 +0000 UTC m=+384.615318891" watchObservedRunningTime="2026-02-17 14:58:38.202934425 +0000 UTC m=+384.618774921" Feb 17 14:58:44 crc kubenswrapper[4717]: I0217 
14:58:44.377409 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj"] Feb 17 14:58:44 crc kubenswrapper[4717]: I0217 14:58:44.379308 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" podUID="431dbb93-53b1-4c14-9c62-9371f3eec0f8" containerName="route-controller-manager" containerID="cri-o://879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b" gracePeriod=30 Feb 17 14:58:44 crc kubenswrapper[4717]: I0217 14:58:44.873980 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:44 crc kubenswrapper[4717]: I0217 14:58:44.979973 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-client-ca\") pod \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " Feb 17 14:58:44 crc kubenswrapper[4717]: I0217 14:58:44.980044 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/431dbb93-53b1-4c14-9c62-9371f3eec0f8-serving-cert\") pod \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " Feb 17 14:58:44 crc kubenswrapper[4717]: I0217 14:58:44.980150 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-config\") pod \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " Feb 17 14:58:44 crc kubenswrapper[4717]: I0217 14:58:44.980209 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgk7h\" 
(UniqueName: \"kubernetes.io/projected/431dbb93-53b1-4c14-9c62-9371f3eec0f8-kube-api-access-jgk7h\") pod \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\" (UID: \"431dbb93-53b1-4c14-9c62-9371f3eec0f8\") " Feb 17 14:58:44 crc kubenswrapper[4717]: I0217 14:58:44.981169 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-config" (OuterVolumeSpecName: "config") pod "431dbb93-53b1-4c14-9c62-9371f3eec0f8" (UID: "431dbb93-53b1-4c14-9c62-9371f3eec0f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:58:44 crc kubenswrapper[4717]: I0217 14:58:44.981744 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "431dbb93-53b1-4c14-9c62-9371f3eec0f8" (UID: "431dbb93-53b1-4c14-9c62-9371f3eec0f8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:58:44 crc kubenswrapper[4717]: I0217 14:58:44.988028 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/431dbb93-53b1-4c14-9c62-9371f3eec0f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "431dbb93-53b1-4c14-9c62-9371f3eec0f8" (UID: "431dbb93-53b1-4c14-9c62-9371f3eec0f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:58:44 crc kubenswrapper[4717]: I0217 14:58:44.988661 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/431dbb93-53b1-4c14-9c62-9371f3eec0f8-kube-api-access-jgk7h" (OuterVolumeSpecName: "kube-api-access-jgk7h") pod "431dbb93-53b1-4c14-9c62-9371f3eec0f8" (UID: "431dbb93-53b1-4c14-9c62-9371f3eec0f8"). InnerVolumeSpecName "kube-api-access-jgk7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.082604 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgk7h\" (UniqueName: \"kubernetes.io/projected/431dbb93-53b1-4c14-9c62-9371f3eec0f8-kube-api-access-jgk7h\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.082675 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.082706 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/431dbb93-53b1-4c14-9c62-9371f3eec0f8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.082748 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/431dbb93-53b1-4c14-9c62-9371f3eec0f8-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.231641 4717 generic.go:334] "Generic (PLEG): container finished" podID="431dbb93-53b1-4c14-9c62-9371f3eec0f8" containerID="879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b" exitCode=0 Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.231739 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.231745 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" event={"ID":"431dbb93-53b1-4c14-9c62-9371f3eec0f8","Type":"ContainerDied","Data":"879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b"} Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.231973 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj" event={"ID":"431dbb93-53b1-4c14-9c62-9371f3eec0f8","Type":"ContainerDied","Data":"f56fb88d3cde37f8908fed62e3e3c005838612d2ef0ba8ea3983498dc99abd41"} Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.232019 4717 scope.go:117] "RemoveContainer" containerID="879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b" Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.267374 4717 scope.go:117] "RemoveContainer" containerID="879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b" Feb 17 14:58:45 crc kubenswrapper[4717]: E0217 14:58:45.268340 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b\": container with ID starting with 879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b not found: ID does not exist" containerID="879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b" Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.268409 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b"} err="failed to get container status \"879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b\": rpc error: code = NotFound desc = 
could not find container \"879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b\": container with ID starting with 879124fbf42c2f7ccfa995ed6f8506ee8bd0bf7b5e287926831aae86c3ba274b not found: ID does not exist" Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.295803 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj"] Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.303871 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bf6dcb7-kz5mj"] Feb 17 14:58:45 crc kubenswrapper[4717]: I0217 14:58:45.860559 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="431dbb93-53b1-4c14-9c62-9371f3eec0f8" path="/var/lib/kubelet/pods/431dbb93-53b1-4c14-9c62-9371f3eec0f8/volumes" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.175679 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw"] Feb 17 14:58:46 crc kubenswrapper[4717]: E0217 14:58:46.176160 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="431dbb93-53b1-4c14-9c62-9371f3eec0f8" containerName="route-controller-manager" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.176206 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="431dbb93-53b1-4c14-9c62-9371f3eec0f8" containerName="route-controller-manager" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.176395 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="431dbb93-53b1-4c14-9c62-9371f3eec0f8" containerName="route-controller-manager" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.179779 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.182400 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.183153 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.183183 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.183460 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.183515 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.185796 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.187850 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw"] Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.306803 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8c8d80-bb91-4689-bdf6-28de64e792b6-config\") pod \"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.306858 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e8c8d80-bb91-4689-bdf6-28de64e792b6-client-ca\") pod \"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.306911 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e8c8d80-bb91-4689-bdf6-28de64e792b6-serving-cert\") pod \"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.307182 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfk8t\" (UniqueName: \"kubernetes.io/projected/2e8c8d80-bb91-4689-bdf6-28de64e792b6-kube-api-access-pfk8t\") pod \"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.409742 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfk8t\" (UniqueName: \"kubernetes.io/projected/2e8c8d80-bb91-4689-bdf6-28de64e792b6-kube-api-access-pfk8t\") pod \"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.409890 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8c8d80-bb91-4689-bdf6-28de64e792b6-config\") pod 
\"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.409933 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e8c8d80-bb91-4689-bdf6-28de64e792b6-client-ca\") pod \"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.410028 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e8c8d80-bb91-4689-bdf6-28de64e792b6-serving-cert\") pod \"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.411837 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8c8d80-bb91-4689-bdf6-28de64e792b6-config\") pod \"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.412262 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e8c8d80-bb91-4689-bdf6-28de64e792b6-client-ca\") pod \"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.424204 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e8c8d80-bb91-4689-bdf6-28de64e792b6-serving-cert\") pod \"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.435703 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfk8t\" (UniqueName: \"kubernetes.io/projected/2e8c8d80-bb91-4689-bdf6-28de64e792b6-kube-api-access-pfk8t\") pod \"route-controller-manager-5c8d996cfc-b5mfw\" (UID: \"2e8c8d80-bb91-4689-bdf6-28de64e792b6\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.515287 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:46 crc kubenswrapper[4717]: I0217 14:58:46.994665 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw"] Feb 17 14:58:47 crc kubenswrapper[4717]: I0217 14:58:47.256619 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" event={"ID":"2e8c8d80-bb91-4689-bdf6-28de64e792b6","Type":"ContainerStarted","Data":"a25d7109ca5d2508d5675d0feda42ff11f912eb4c1147d7e70700239b4499ae2"} Feb 17 14:58:47 crc kubenswrapper[4717]: I0217 14:58:47.257166 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" event={"ID":"2e8c8d80-bb91-4689-bdf6-28de64e792b6","Type":"ContainerStarted","Data":"10d14693154cbedff9342c0a8054c9288860d69ceaa557542d1fd9249e504465"} Feb 17 14:58:47 crc kubenswrapper[4717]: I0217 14:58:47.257206 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:47 crc kubenswrapper[4717]: I0217 14:58:47.259519 4717 patch_prober.go:28] interesting pod/route-controller-manager-5c8d996cfc-b5mfw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body= Feb 17 14:58:47 crc kubenswrapper[4717]: I0217 14:58:47.259621 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" podUID="2e8c8d80-bb91-4689-bdf6-28de64e792b6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" Feb 17 14:58:47 crc kubenswrapper[4717]: I0217 14:58:47.291390 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" podStartSLOduration=3.291358505 podStartE2EDuration="3.291358505s" podCreationTimestamp="2026-02-17 14:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:58:47.288397118 +0000 UTC m=+393.704237614" watchObservedRunningTime="2026-02-17 14:58:47.291358505 +0000 UTC m=+393.707199021" Feb 17 14:58:48 crc kubenswrapper[4717]: I0217 14:58:48.266283 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c8d996cfc-b5mfw" Feb 17 14:58:50 crc kubenswrapper[4717]: I0217 14:58:50.809209 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:58:50 crc kubenswrapper[4717]: I0217 14:58:50.810023 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.775176 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tbvbf"] Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.776752 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tbvbf" podUID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" containerName="registry-server" containerID="cri-o://060b9fdcf6947089e74c80da0d707e01f5e25987f36c129028236dea4049fed4" gracePeriod=30 Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.799188 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-49dn7"] Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.799667 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-49dn7" podUID="59890efa-3bd7-474e-b962-99c705159847" containerName="registry-server" containerID="cri-o://6d91d88403cab3e8ff653d0d0d873034df34c990b6022b519a5cec16d6230ce8" gracePeriod=30 Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.822671 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bxg9v"] Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.823035 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" 
podUID="2fbfa975-4e24-4213-8368-ba1af6b39e21" containerName="marketplace-operator" containerID="cri-o://316e0e26038483307357c997ce468ecf66894792a94a9d08310f6b194d6ba6db" gracePeriod=30 Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.839839 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2gw7"] Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.840330 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p2gw7" podUID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" containerName="registry-server" containerID="cri-o://c65edb7a4a9e69fd801c9e7a08ed2ee3ccd50bce7071b2c71049f23861ae672b" gracePeriod=30 Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.848695 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dsx5v"] Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.849000 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dsx5v" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" containerName="registry-server" containerID="cri-o://a7d0f93f2303fa6c27c9343bf49fab182a1cfc956a3d3c1881a03811652bf863" gracePeriod=30 Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.865370 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-njsvg"] Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.866468 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:52 crc kubenswrapper[4717]: I0217 14:58:52.871532 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-njsvg"] Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.026965 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/813c7436-6b2f-45ed-8fc8-d400f00c80fd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-njsvg\" (UID: \"813c7436-6b2f-45ed-8fc8-d400f00c80fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.027158 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/813c7436-6b2f-45ed-8fc8-d400f00c80fd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-njsvg\" (UID: \"813c7436-6b2f-45ed-8fc8-d400f00c80fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.027300 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph62t\" (UniqueName: \"kubernetes.io/projected/813c7436-6b2f-45ed-8fc8-d400f00c80fd-kube-api-access-ph62t\") pod \"marketplace-operator-79b997595-njsvg\" (UID: \"813c7436-6b2f-45ed-8fc8-d400f00c80fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.129799 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph62t\" (UniqueName: \"kubernetes.io/projected/813c7436-6b2f-45ed-8fc8-d400f00c80fd-kube-api-access-ph62t\") pod \"marketplace-operator-79b997595-njsvg\" (UID: 
\"813c7436-6b2f-45ed-8fc8-d400f00c80fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.130541 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/813c7436-6b2f-45ed-8fc8-d400f00c80fd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-njsvg\" (UID: \"813c7436-6b2f-45ed-8fc8-d400f00c80fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.130612 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/813c7436-6b2f-45ed-8fc8-d400f00c80fd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-njsvg\" (UID: \"813c7436-6b2f-45ed-8fc8-d400f00c80fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.135415 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/813c7436-6b2f-45ed-8fc8-d400f00c80fd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-njsvg\" (UID: \"813c7436-6b2f-45ed-8fc8-d400f00c80fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.147200 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/813c7436-6b2f-45ed-8fc8-d400f00c80fd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-njsvg\" (UID: \"813c7436-6b2f-45ed-8fc8-d400f00c80fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.169668 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ph62t\" (UniqueName: \"kubernetes.io/projected/813c7436-6b2f-45ed-8fc8-d400f00c80fd-kube-api-access-ph62t\") pod \"marketplace-operator-79b997595-njsvg\" (UID: \"813c7436-6b2f-45ed-8fc8-d400f00c80fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.193452 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.301698 4717 generic.go:334] "Generic (PLEG): container finished" podID="2fbfa975-4e24-4213-8368-ba1af6b39e21" containerID="316e0e26038483307357c997ce468ecf66894792a94a9d08310f6b194d6ba6db" exitCode=0 Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.301804 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" event={"ID":"2fbfa975-4e24-4213-8368-ba1af6b39e21","Type":"ContainerDied","Data":"316e0e26038483307357c997ce468ecf66894792a94a9d08310f6b194d6ba6db"} Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.301892 4717 scope.go:117] "RemoveContainer" containerID="6dca40a0f9057be15062a06f09179c08d0369bc7da7a166c9c75fa49c718c09a" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.305708 4717 generic.go:334] "Generic (PLEG): container finished" podID="c61c5e17-781e-42f4-852d-0e0604721b86" containerID="a7d0f93f2303fa6c27c9343bf49fab182a1cfc956a3d3c1881a03811652bf863" exitCode=0 Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.305781 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dsx5v" event={"ID":"c61c5e17-781e-42f4-852d-0e0604721b86","Type":"ContainerDied","Data":"a7d0f93f2303fa6c27c9343bf49fab182a1cfc956a3d3c1881a03811652bf863"} Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.307840 4717 generic.go:334] "Generic (PLEG): container finished" podID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" 
containerID="060b9fdcf6947089e74c80da0d707e01f5e25987f36c129028236dea4049fed4" exitCode=0 Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.307899 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbvbf" event={"ID":"0bae9eeb-1b53-44fd-9751-7b03463ceaf9","Type":"ContainerDied","Data":"060b9fdcf6947089e74c80da0d707e01f5e25987f36c129028236dea4049fed4"} Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.314425 4717 generic.go:334] "Generic (PLEG): container finished" podID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" containerID="c65edb7a4a9e69fd801c9e7a08ed2ee3ccd50bce7071b2c71049f23861ae672b" exitCode=0 Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.314510 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2gw7" event={"ID":"f91b518f-7204-4a6b-bcfb-bc763ab14d2b","Type":"ContainerDied","Data":"c65edb7a4a9e69fd801c9e7a08ed2ee3ccd50bce7071b2c71049f23861ae672b"} Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.318397 4717 generic.go:334] "Generic (PLEG): container finished" podID="59890efa-3bd7-474e-b962-99c705159847" containerID="6d91d88403cab3e8ff653d0d0d873034df34c990b6022b519a5cec16d6230ce8" exitCode=0 Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.318427 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49dn7" event={"ID":"59890efa-3bd7-474e-b962-99c705159847","Type":"ContainerDied","Data":"6d91d88403cab3e8ff653d0d0d873034df34c990b6022b519a5cec16d6230ce8"} Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.569223 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.615129 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.634469 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.642716 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-catalog-content\") pod \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.642964 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgp6p\" (UniqueName: \"kubernetes.io/projected/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-kube-api-access-dgp6p\") pod \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.643423 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-utilities\") pod \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\" (UID: \"0bae9eeb-1b53-44fd-9751-7b03463ceaf9\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.644693 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-utilities" (OuterVolumeSpecName: "utilities") pod "0bae9eeb-1b53-44fd-9751-7b03463ceaf9" (UID: "0bae9eeb-1b53-44fd-9751-7b03463ceaf9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.649615 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-kube-api-access-dgp6p" (OuterVolumeSpecName: "kube-api-access-dgp6p") pod "0bae9eeb-1b53-44fd-9751-7b03463ceaf9" (UID: "0bae9eeb-1b53-44fd-9751-7b03463ceaf9"). InnerVolumeSpecName "kube-api-access-dgp6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.714746 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bae9eeb-1b53-44fd-9751-7b03463ceaf9" (UID: "0bae9eeb-1b53-44fd-9751-7b03463ceaf9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.745249 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-operator-metrics\") pod \"2fbfa975-4e24-4213-8368-ba1af6b39e21\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.745327 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2g2p\" (UniqueName: \"kubernetes.io/projected/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-kube-api-access-q2g2p\") pod \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.745443 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkpwb\" (UniqueName: \"kubernetes.io/projected/2fbfa975-4e24-4213-8368-ba1af6b39e21-kube-api-access-bkpwb\") pod 
\"2fbfa975-4e24-4213-8368-ba1af6b39e21\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.745478 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-trusted-ca\") pod \"2fbfa975-4e24-4213-8368-ba1af6b39e21\" (UID: \"2fbfa975-4e24-4213-8368-ba1af6b39e21\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.745551 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-catalog-content\") pod \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.745604 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-utilities\") pod \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\" (UID: \"f91b518f-7204-4a6b-bcfb-bc763ab14d2b\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.746042 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.746073 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.746120 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgp6p\" (UniqueName: \"kubernetes.io/projected/0bae9eeb-1b53-44fd-9751-7b03463ceaf9-kube-api-access-dgp6p\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:53 crc 
kubenswrapper[4717]: I0217 14:58:53.747623 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2fbfa975-4e24-4213-8368-ba1af6b39e21" (UID: "2fbfa975-4e24-4213-8368-ba1af6b39e21"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.747860 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-utilities" (OuterVolumeSpecName: "utilities") pod "f91b518f-7204-4a6b-bcfb-bc763ab14d2b" (UID: "f91b518f-7204-4a6b-bcfb-bc763ab14d2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.753744 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-kube-api-access-q2g2p" (OuterVolumeSpecName: "kube-api-access-q2g2p") pod "f91b518f-7204-4a6b-bcfb-bc763ab14d2b" (UID: "f91b518f-7204-4a6b-bcfb-bc763ab14d2b"). InnerVolumeSpecName "kube-api-access-q2g2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.755871 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2fbfa975-4e24-4213-8368-ba1af6b39e21" (UID: "2fbfa975-4e24-4213-8368-ba1af6b39e21"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.755987 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbfa975-4e24-4213-8368-ba1af6b39e21-kube-api-access-bkpwb" (OuterVolumeSpecName: "kube-api-access-bkpwb") pod "2fbfa975-4e24-4213-8368-ba1af6b39e21" (UID: "2fbfa975-4e24-4213-8368-ba1af6b39e21"). InnerVolumeSpecName "kube-api-access-bkpwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.790558 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f91b518f-7204-4a6b-bcfb-bc763ab14d2b" (UID: "f91b518f-7204-4a6b-bcfb-bc763ab14d2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.845050 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-njsvg"] Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.847257 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.847280 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2g2p\" (UniqueName: \"kubernetes.io/projected/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-kube-api-access-q2g2p\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.847290 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkpwb\" (UniqueName: \"kubernetes.io/projected/2fbfa975-4e24-4213-8368-ba1af6b39e21-kube-api-access-bkpwb\") on node \"crc\" DevicePath 
\"\"" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.847305 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbfa975-4e24-4213-8368-ba1af6b39e21-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.847316 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.847327 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f91b518f-7204-4a6b-bcfb-bc763ab14d2b-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:53 crc kubenswrapper[4717]: W0217 14:58:53.850717 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod813c7436_6b2f_45ed_8fc8_d400f00c80fd.slice/crio-b7e733f0689913689ee3b10a3f24f67948c42a33bf8efabf01a89cd777623a54 WatchSource:0}: Error finding container b7e733f0689913689ee3b10a3f24f67948c42a33bf8efabf01a89cd777623a54: Status 404 returned error can't find the container with id b7e733f0689913689ee3b10a3f24f67948c42a33bf8efabf01a89cd777623a54 Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.875767 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.948312 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54mqj\" (UniqueName: \"kubernetes.io/projected/c61c5e17-781e-42f4-852d-0e0604721b86-kube-api-access-54mqj\") pod \"c61c5e17-781e-42f4-852d-0e0604721b86\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.948372 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-catalog-content\") pod \"c61c5e17-781e-42f4-852d-0e0604721b86\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.948532 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-utilities\") pod \"c61c5e17-781e-42f4-852d-0e0604721b86\" (UID: \"c61c5e17-781e-42f4-852d-0e0604721b86\") " Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.949728 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-utilities" (OuterVolumeSpecName: "utilities") pod "c61c5e17-781e-42f4-852d-0e0604721b86" (UID: "c61c5e17-781e-42f4-852d-0e0604721b86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:58:53 crc kubenswrapper[4717]: I0217 14:58:53.954290 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c61c5e17-781e-42f4-852d-0e0604721b86-kube-api-access-54mqj" (OuterVolumeSpecName: "kube-api-access-54mqj") pod "c61c5e17-781e-42f4-852d-0e0604721b86" (UID: "c61c5e17-781e-42f4-852d-0e0604721b86"). InnerVolumeSpecName "kube-api-access-54mqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.001780 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.053577 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.053619 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54mqj\" (UniqueName: \"kubernetes.io/projected/c61c5e17-781e-42f4-852d-0e0604721b86-kube-api-access-54mqj\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.091338 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c61c5e17-781e-42f4-852d-0e0604721b86" (UID: "c61c5e17-781e-42f4-852d-0e0604721b86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.154542 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-catalog-content\") pod \"59890efa-3bd7-474e-b962-99c705159847\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.154608 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjqqc\" (UniqueName: \"kubernetes.io/projected/59890efa-3bd7-474e-b962-99c705159847-kube-api-access-pjqqc\") pod \"59890efa-3bd7-474e-b962-99c705159847\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.154670 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-utilities\") pod \"59890efa-3bd7-474e-b962-99c705159847\" (UID: \"59890efa-3bd7-474e-b962-99c705159847\") " Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.155063 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c61c5e17-781e-42f4-852d-0e0604721b86-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.155956 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-utilities" (OuterVolumeSpecName: "utilities") pod "59890efa-3bd7-474e-b962-99c705159847" (UID: "59890efa-3bd7-474e-b962-99c705159847"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.159845 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59890efa-3bd7-474e-b962-99c705159847-kube-api-access-pjqqc" (OuterVolumeSpecName: "kube-api-access-pjqqc") pod "59890efa-3bd7-474e-b962-99c705159847" (UID: "59890efa-3bd7-474e-b962-99c705159847"). InnerVolumeSpecName "kube-api-access-pjqqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.213166 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59890efa-3bd7-474e-b962-99c705159847" (UID: "59890efa-3bd7-474e-b962-99c705159847"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.256429 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.256486 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjqqc\" (UniqueName: \"kubernetes.io/projected/59890efa-3bd7-474e-b962-99c705159847-kube-api-access-pjqqc\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.256508 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59890efa-3bd7-474e-b962-99c705159847-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.327284 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbvbf" 
event={"ID":"0bae9eeb-1b53-44fd-9751-7b03463ceaf9","Type":"ContainerDied","Data":"391a6d19b35f88460acf3cf60717b553987886ac9aa9e5a3759403bb83cd8173"} Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.327389 4717 scope.go:117] "RemoveContainer" containerID="060b9fdcf6947089e74c80da0d707e01f5e25987f36c129028236dea4049fed4" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.327482 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbvbf" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.330632 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2gw7" event={"ID":"f91b518f-7204-4a6b-bcfb-bc763ab14d2b","Type":"ContainerDied","Data":"5bbf2495e9f9f8da4910ea349ad4c74df30e1393f3c217d7eb89f63aa7d9d1aa"} Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.330792 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2gw7" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.335751 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49dn7" event={"ID":"59890efa-3bd7-474e-b962-99c705159847","Type":"ContainerDied","Data":"2e2f941eab79d6f8bcd434e7f44852b7dcd7af7e66ade07ac979be436204b1a4"} Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.335849 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-49dn7" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.341245 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" event={"ID":"813c7436-6b2f-45ed-8fc8-d400f00c80fd","Type":"ContainerStarted","Data":"256c0ede621fea3f908a33bdb417b2dc3284109c5aedaeaf2f59b764e08e73f1"} Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.341283 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" event={"ID":"813c7436-6b2f-45ed-8fc8-d400f00c80fd","Type":"ContainerStarted","Data":"b7e733f0689913689ee3b10a3f24f67948c42a33bf8efabf01a89cd777623a54"} Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.344182 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.344241 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bxg9v" event={"ID":"2fbfa975-4e24-4213-8368-ba1af6b39e21","Type":"ContainerDied","Data":"badce36dbfe5f4026d72dc734e534b9ea913545200d7dc87f4d0a7474332f1c2"} Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.347980 4717 scope.go:117] "RemoveContainer" containerID="72acb4aa00269a342bc4ea255ca0b24ea4e9ec8baaecc368b8d8478d99400cef" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.349887 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dsx5v" event={"ID":"c61c5e17-781e-42f4-852d-0e0604721b86","Type":"ContainerDied","Data":"be0f9358af7ff3511338d1db489ab7079287a831b3314b6687cc8bf630813b83"} Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.349988 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dsx5v" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.364250 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tbvbf"] Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.378409 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tbvbf"] Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.383559 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2gw7"] Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.384737 4717 scope.go:117] "RemoveContainer" containerID="ae3f76cad3af7b1848b1007ae54ee1981e9e1cc87d573d066ac5b03ea2d57a3e" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.389213 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2gw7"] Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.399040 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bxg9v"] Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.405663 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bxg9v"] Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.413602 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-49dn7"] Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.415877 4717 scope.go:117] "RemoveContainer" containerID="c65edb7a4a9e69fd801c9e7a08ed2ee3ccd50bce7071b2c71049f23861ae672b" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.421004 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-49dn7"] Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.440843 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-dsx5v"] Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.445614 4717 scope.go:117] "RemoveContainer" containerID="cbd4098d589c6a977506dcdb654896865fb61f6b4602773be1c2f8aa3f37a80e" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.449362 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dsx5v"] Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.481190 4717 scope.go:117] "RemoveContainer" containerID="c3cae9a9178cf8ee9684b2d34fc4ba2ad8993b4376b8dbbda96c0627f8ec3c25" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.497661 4717 scope.go:117] "RemoveContainer" containerID="6d91d88403cab3e8ff653d0d0d873034df34c990b6022b519a5cec16d6230ce8" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.512882 4717 scope.go:117] "RemoveContainer" containerID="22df71ad461663447fabe0d2b96aaadc8f48852086feb5a62c82e67ac41ff3e3" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.528282 4717 scope.go:117] "RemoveContainer" containerID="a225dcabf286ffcf4d7bcafa646434ca74efc675a58e106c2aa8c93ad38de9b5" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.543540 4717 scope.go:117] "RemoveContainer" containerID="316e0e26038483307357c997ce468ecf66894792a94a9d08310f6b194d6ba6db" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.561056 4717 scope.go:117] "RemoveContainer" containerID="a7d0f93f2303fa6c27c9343bf49fab182a1cfc956a3d3c1881a03811652bf863" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.580348 4717 scope.go:117] "RemoveContainer" containerID="6a0be846ae39e6e31c598632bcab84580d85b50efb092094f84ebe0f80bdd9ac" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.599174 4717 scope.go:117] "RemoveContainer" containerID="d80d0309d05208bda67a874cabca4a6b6dbabba774d30fe7d9d64c2791079467" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988287 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-jk5wb"] Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988493 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59890efa-3bd7-474e-b962-99c705159847" containerName="extract-content" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988505 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="59890efa-3bd7-474e-b962-99c705159847" containerName="extract-content" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988516 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59890efa-3bd7-474e-b962-99c705159847" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988522 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="59890efa-3bd7-474e-b962-99c705159847" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988530 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbfa975-4e24-4213-8368-ba1af6b39e21" containerName="marketplace-operator" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988537 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbfa975-4e24-4213-8368-ba1af6b39e21" containerName="marketplace-operator" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988546 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988551 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988560 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" containerName="extract-content" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988566 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" containerName="extract-content" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988574 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" containerName="extract-utilities" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988581 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" containerName="extract-utilities" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988590 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988597 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988606 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" containerName="extract-utilities" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988611 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" containerName="extract-utilities" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988621 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" containerName="extract-utilities" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988628 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" containerName="extract-utilities" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988637 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988644 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988652 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" containerName="extract-content" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988657 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" containerName="extract-content" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988664 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59890efa-3bd7-474e-b962-99c705159847" containerName="extract-utilities" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988671 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="59890efa-3bd7-474e-b962-99c705159847" containerName="extract-utilities" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988679 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbfa975-4e24-4213-8368-ba1af6b39e21" containerName="marketplace-operator" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988685 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbfa975-4e24-4213-8368-ba1af6b39e21" containerName="marketplace-operator" Feb 17 14:58:54 crc kubenswrapper[4717]: E0217 14:58:54.988698 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" containerName="extract-content" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988704 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" containerName="extract-content" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988782 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988792 4717 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2fbfa975-4e24-4213-8368-ba1af6b39e21" containerName="marketplace-operator" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988801 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988810 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="59890efa-3bd7-474e-b962-99c705159847" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988828 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" containerName="registry-server" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.988992 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fbfa975-4e24-4213-8368-ba1af6b39e21" containerName="marketplace-operator" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.989616 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:58:54 crc kubenswrapper[4717]: I0217 14:58:54.993690 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.005958 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jk5wb"] Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.066965 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c636add4-d170-4295-8eeb-3f4972cf20d0-utilities\") pod \"redhat-marketplace-jk5wb\" (UID: \"c636add4-d170-4295-8eeb-3f4972cf20d0\") " pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.067044 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c636add4-d170-4295-8eeb-3f4972cf20d0-catalog-content\") pod \"redhat-marketplace-jk5wb\" (UID: \"c636add4-d170-4295-8eeb-3f4972cf20d0\") " pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.067073 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz4m7\" (UniqueName: \"kubernetes.io/projected/c636add4-d170-4295-8eeb-3f4972cf20d0-kube-api-access-lz4m7\") pod \"redhat-marketplace-jk5wb\" (UID: \"c636add4-d170-4295-8eeb-3f4972cf20d0\") " pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.168203 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c636add4-d170-4295-8eeb-3f4972cf20d0-utilities\") pod \"redhat-marketplace-jk5wb\" (UID: 
\"c636add4-d170-4295-8eeb-3f4972cf20d0\") " pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.168305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c636add4-d170-4295-8eeb-3f4972cf20d0-catalog-content\") pod \"redhat-marketplace-jk5wb\" (UID: \"c636add4-d170-4295-8eeb-3f4972cf20d0\") " pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.168339 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz4m7\" (UniqueName: \"kubernetes.io/projected/c636add4-d170-4295-8eeb-3f4972cf20d0-kube-api-access-lz4m7\") pod \"redhat-marketplace-jk5wb\" (UID: \"c636add4-d170-4295-8eeb-3f4972cf20d0\") " pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.169223 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c636add4-d170-4295-8eeb-3f4972cf20d0-catalog-content\") pod \"redhat-marketplace-jk5wb\" (UID: \"c636add4-d170-4295-8eeb-3f4972cf20d0\") " pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.169245 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c636add4-d170-4295-8eeb-3f4972cf20d0-utilities\") pod \"redhat-marketplace-jk5wb\" (UID: \"c636add4-d170-4295-8eeb-3f4972cf20d0\") " pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.200225 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz4m7\" (UniqueName: \"kubernetes.io/projected/c636add4-d170-4295-8eeb-3f4972cf20d0-kube-api-access-lz4m7\") pod \"redhat-marketplace-jk5wb\" (UID: 
\"c636add4-d170-4295-8eeb-3f4972cf20d0\") " pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.205302 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b4kxp"] Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.207434 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.210746 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.214842 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b4kxp"] Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.321896 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.364989 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.370736 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43eecc72-1d78-474c-b7c3-a773e72acb6a-utilities\") pod \"certified-operators-b4kxp\" (UID: \"43eecc72-1d78-474c-b7c3-a773e72acb6a\") " pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.370791 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7nrk\" (UniqueName: \"kubernetes.io/projected/43eecc72-1d78-474c-b7c3-a773e72acb6a-kube-api-access-c7nrk\") pod \"certified-operators-b4kxp\" (UID: \"43eecc72-1d78-474c-b7c3-a773e72acb6a\") " 
pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.370948 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.371309 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43eecc72-1d78-474c-b7c3-a773e72acb6a-catalog-content\") pod \"certified-operators-b4kxp\" (UID: \"43eecc72-1d78-474c-b7c3-a773e72acb6a\") " pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.383497 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-njsvg" podStartSLOduration=3.3834831850000002 podStartE2EDuration="3.383483185s" podCreationTimestamp="2026-02-17 14:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:58:55.383110964 +0000 UTC m=+401.798951450" watchObservedRunningTime="2026-02-17 14:58:55.383483185 +0000 UTC m=+401.799323661" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.474619 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43eecc72-1d78-474c-b7c3-a773e72acb6a-catalog-content\") pod \"certified-operators-b4kxp\" (UID: \"43eecc72-1d78-474c-b7c3-a773e72acb6a\") " pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.475308 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43eecc72-1d78-474c-b7c3-a773e72acb6a-utilities\") pod \"certified-operators-b4kxp\" (UID: \"43eecc72-1d78-474c-b7c3-a773e72acb6a\") " 
pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.475357 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7nrk\" (UniqueName: \"kubernetes.io/projected/43eecc72-1d78-474c-b7c3-a773e72acb6a-kube-api-access-c7nrk\") pod \"certified-operators-b4kxp\" (UID: \"43eecc72-1d78-474c-b7c3-a773e72acb6a\") " pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.477137 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43eecc72-1d78-474c-b7c3-a773e72acb6a-catalog-content\") pod \"certified-operators-b4kxp\" (UID: \"43eecc72-1d78-474c-b7c3-a773e72acb6a\") " pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.477668 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43eecc72-1d78-474c-b7c3-a773e72acb6a-utilities\") pod \"certified-operators-b4kxp\" (UID: \"43eecc72-1d78-474c-b7c3-a773e72acb6a\") " pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.502934 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7nrk\" (UniqueName: \"kubernetes.io/projected/43eecc72-1d78-474c-b7c3-a773e72acb6a-kube-api-access-c7nrk\") pod \"certified-operators-b4kxp\" (UID: \"43eecc72-1d78-474c-b7c3-a773e72acb6a\") " pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.530388 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.773829 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jk5wb"] Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.854332 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bae9eeb-1b53-44fd-9751-7b03463ceaf9" path="/var/lib/kubelet/pods/0bae9eeb-1b53-44fd-9751-7b03463ceaf9/volumes" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.855115 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fbfa975-4e24-4213-8368-ba1af6b39e21" path="/var/lib/kubelet/pods/2fbfa975-4e24-4213-8368-ba1af6b39e21/volumes" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.855590 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59890efa-3bd7-474e-b962-99c705159847" path="/var/lib/kubelet/pods/59890efa-3bd7-474e-b962-99c705159847/volumes" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.856747 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c61c5e17-781e-42f4-852d-0e0604721b86" path="/var/lib/kubelet/pods/c61c5e17-781e-42f4-852d-0e0604721b86/volumes" Feb 17 14:58:55 crc kubenswrapper[4717]: I0217 14:58:55.857426 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91b518f-7204-4a6b-bcfb-bc763ab14d2b" path="/var/lib/kubelet/pods/f91b518f-7204-4a6b-bcfb-bc763ab14d2b/volumes" Feb 17 14:58:56 crc kubenswrapper[4717]: I0217 14:58:56.007129 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b4kxp"] Feb 17 14:58:56 crc kubenswrapper[4717]: W0217 14:58:56.012772 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43eecc72_1d78_474c_b7c3_a773e72acb6a.slice/crio-e0f93c3c3d27b0b6fdb78a42e1738983c43d72fa243c18b5101fdf458ddbb160 WatchSource:0}: 
Error finding container e0f93c3c3d27b0b6fdb78a42e1738983c43d72fa243c18b5101fdf458ddbb160: Status 404 returned error can't find the container with id e0f93c3c3d27b0b6fdb78a42e1738983c43d72fa243c18b5101fdf458ddbb160 Feb 17 14:58:56 crc kubenswrapper[4717]: I0217 14:58:56.373116 4717 generic.go:334] "Generic (PLEG): container finished" podID="43eecc72-1d78-474c-b7c3-a773e72acb6a" containerID="23048038f7018eb1ddd88d83eb24e9ca41f6fc544294bad774bd73c58c42f4d3" exitCode=0 Feb 17 14:58:56 crc kubenswrapper[4717]: I0217 14:58:56.373239 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4kxp" event={"ID":"43eecc72-1d78-474c-b7c3-a773e72acb6a","Type":"ContainerDied","Data":"23048038f7018eb1ddd88d83eb24e9ca41f6fc544294bad774bd73c58c42f4d3"} Feb 17 14:58:56 crc kubenswrapper[4717]: I0217 14:58:56.373331 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4kxp" event={"ID":"43eecc72-1d78-474c-b7c3-a773e72acb6a","Type":"ContainerStarted","Data":"e0f93c3c3d27b0b6fdb78a42e1738983c43d72fa243c18b5101fdf458ddbb160"} Feb 17 14:58:56 crc kubenswrapper[4717]: I0217 14:58:56.377598 4717 generic.go:334] "Generic (PLEG): container finished" podID="c636add4-d170-4295-8eeb-3f4972cf20d0" containerID="557b14ec58d32cdd980fea693b707177c01c7f8bda3225adb009c8c8bd415155" exitCode=0 Feb 17 14:58:56 crc kubenswrapper[4717]: I0217 14:58:56.378029 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jk5wb" event={"ID":"c636add4-d170-4295-8eeb-3f4972cf20d0","Type":"ContainerDied","Data":"557b14ec58d32cdd980fea693b707177c01c7f8bda3225adb009c8c8bd415155"} Feb 17 14:58:56 crc kubenswrapper[4717]: I0217 14:58:56.378097 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jk5wb" 
event={"ID":"c636add4-d170-4295-8eeb-3f4972cf20d0","Type":"ContainerStarted","Data":"00259c5a5d8bc009c771edfe9238adc5036c712359ed8ce1cc5829dc0a12149b"} Feb 17 14:58:56 crc kubenswrapper[4717]: I0217 14:58:56.635367 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wkkwn" Feb 17 14:58:56 crc kubenswrapper[4717]: I0217 14:58:56.704390 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mvgrd"] Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.396801 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jb79c"] Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.401182 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.407361 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.418222 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jb79c"] Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.510119 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-catalog-content\") pod \"redhat-operators-jb79c\" (UID: \"f2ed0865-16aa-42df-94e8-c228b0baf920\") " pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.510233 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-utilities\") pod \"redhat-operators-jb79c\" (UID: 
\"f2ed0865-16aa-42df-94e8-c228b0baf920\") " pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.510474 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c9bk\" (UniqueName: \"kubernetes.io/projected/f2ed0865-16aa-42df-94e8-c228b0baf920-kube-api-access-6c9bk\") pod \"redhat-operators-jb79c\" (UID: \"f2ed0865-16aa-42df-94e8-c228b0baf920\") " pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.589378 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rb6w5"] Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.590417 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.596315 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.605288 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rb6w5"] Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.611737 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c9bk\" (UniqueName: \"kubernetes.io/projected/f2ed0865-16aa-42df-94e8-c228b0baf920-kube-api-access-6c9bk\") pod \"redhat-operators-jb79c\" (UID: \"f2ed0865-16aa-42df-94e8-c228b0baf920\") " pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.611846 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-catalog-content\") pod \"redhat-operators-jb79c\" (UID: \"f2ed0865-16aa-42df-94e8-c228b0baf920\") " 
pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.611897 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-utilities\") pod \"redhat-operators-jb79c\" (UID: \"f2ed0865-16aa-42df-94e8-c228b0baf920\") " pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.612744 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-utilities\") pod \"redhat-operators-jb79c\" (UID: \"f2ed0865-16aa-42df-94e8-c228b0baf920\") " pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.613017 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-catalog-content\") pod \"redhat-operators-jb79c\" (UID: \"f2ed0865-16aa-42df-94e8-c228b0baf920\") " pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.638693 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c9bk\" (UniqueName: \"kubernetes.io/projected/f2ed0865-16aa-42df-94e8-c228b0baf920-kube-api-access-6c9bk\") pod \"redhat-operators-jb79c\" (UID: \"f2ed0865-16aa-42df-94e8-c228b0baf920\") " pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.713260 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46c61af5-321a-4195-8d6b-a95c65c5eae3-utilities\") pod \"community-operators-rb6w5\" (UID: \"46c61af5-321a-4195-8d6b-a95c65c5eae3\") " pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:58:57 
crc kubenswrapper[4717]: I0217 14:58:57.713349 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stw6q\" (UniqueName: \"kubernetes.io/projected/46c61af5-321a-4195-8d6b-a95c65c5eae3-kube-api-access-stw6q\") pod \"community-operators-rb6w5\" (UID: \"46c61af5-321a-4195-8d6b-a95c65c5eae3\") " pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.713373 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46c61af5-321a-4195-8d6b-a95c65c5eae3-catalog-content\") pod \"community-operators-rb6w5\" (UID: \"46c61af5-321a-4195-8d6b-a95c65c5eae3\") " pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.731554 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.814818 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46c61af5-321a-4195-8d6b-a95c65c5eae3-utilities\") pod \"community-operators-rb6w5\" (UID: \"46c61af5-321a-4195-8d6b-a95c65c5eae3\") " pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.814881 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stw6q\" (UniqueName: \"kubernetes.io/projected/46c61af5-321a-4195-8d6b-a95c65c5eae3-kube-api-access-stw6q\") pod \"community-operators-rb6w5\" (UID: \"46c61af5-321a-4195-8d6b-a95c65c5eae3\") " pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.814933 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/46c61af5-321a-4195-8d6b-a95c65c5eae3-catalog-content\") pod \"community-operators-rb6w5\" (UID: \"46c61af5-321a-4195-8d6b-a95c65c5eae3\") " pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.815482 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46c61af5-321a-4195-8d6b-a95c65c5eae3-catalog-content\") pod \"community-operators-rb6w5\" (UID: \"46c61af5-321a-4195-8d6b-a95c65c5eae3\") " pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.815710 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46c61af5-321a-4195-8d6b-a95c65c5eae3-utilities\") pod \"community-operators-rb6w5\" (UID: \"46c61af5-321a-4195-8d6b-a95c65c5eae3\") " pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.845524 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stw6q\" (UniqueName: \"kubernetes.io/projected/46c61af5-321a-4195-8d6b-a95c65c5eae3-kube-api-access-stw6q\") pod \"community-operators-rb6w5\" (UID: \"46c61af5-321a-4195-8d6b-a95c65c5eae3\") " pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:58:57 crc kubenswrapper[4717]: I0217 14:58:57.920805 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:58:58 crc kubenswrapper[4717]: I0217 14:58:58.147011 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rb6w5"] Feb 17 14:58:58 crc kubenswrapper[4717]: I0217 14:58:58.179382 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jb79c"] Feb 17 14:58:58 crc kubenswrapper[4717]: W0217 14:58:58.190551 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2ed0865_16aa_42df_94e8_c228b0baf920.slice/crio-566d8ff9ae0765fe05847e36a3298e428f46b57fbe0b4094e3c64033710a7333 WatchSource:0}: Error finding container 566d8ff9ae0765fe05847e36a3298e428f46b57fbe0b4094e3c64033710a7333: Status 404 returned error can't find the container with id 566d8ff9ae0765fe05847e36a3298e428f46b57fbe0b4094e3c64033710a7333 Feb 17 14:58:58 crc kubenswrapper[4717]: I0217 14:58:58.392173 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rb6w5" event={"ID":"46c61af5-321a-4195-8d6b-a95c65c5eae3","Type":"ContainerStarted","Data":"3390bef4b25d73219a4d40bdc0f1e55dee931c0ba627ee13ed9445c627cfdc06"} Feb 17 14:58:58 crc kubenswrapper[4717]: I0217 14:58:58.394863 4717 generic.go:334] "Generic (PLEG): container finished" podID="c636add4-d170-4295-8eeb-3f4972cf20d0" containerID="9e508b693fd2324904ffa17a6fbeff8be5ce6bf89d79c2f0684fe796bbd971c0" exitCode=0 Feb 17 14:58:58 crc kubenswrapper[4717]: I0217 14:58:58.394924 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jk5wb" event={"ID":"c636add4-d170-4295-8eeb-3f4972cf20d0","Type":"ContainerDied","Data":"9e508b693fd2324904ffa17a6fbeff8be5ce6bf89d79c2f0684fe796bbd971c0"} Feb 17 14:58:58 crc kubenswrapper[4717]: I0217 14:58:58.395821 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jb79c" event={"ID":"f2ed0865-16aa-42df-94e8-c228b0baf920","Type":"ContainerStarted","Data":"566d8ff9ae0765fe05847e36a3298e428f46b57fbe0b4094e3c64033710a7333"} Feb 17 14:58:59 crc kubenswrapper[4717]: I0217 14:58:59.404394 4717 generic.go:334] "Generic (PLEG): container finished" podID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerID="1a25c373aec7d594536cd1fb01e6aa1575bbdc3e9554f50fbf21f0cc49bf6f35" exitCode=0 Feb 17 14:58:59 crc kubenswrapper[4717]: I0217 14:58:59.404509 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb79c" event={"ID":"f2ed0865-16aa-42df-94e8-c228b0baf920","Type":"ContainerDied","Data":"1a25c373aec7d594536cd1fb01e6aa1575bbdc3e9554f50fbf21f0cc49bf6f35"} Feb 17 14:58:59 crc kubenswrapper[4717]: I0217 14:58:59.407431 4717 generic.go:334] "Generic (PLEG): container finished" podID="43eecc72-1d78-474c-b7c3-a773e72acb6a" containerID="457d463fd45eec7ae00b6f0f78bff55097105ea5e84a8f9837229920cdb402c1" exitCode=0 Feb 17 14:58:59 crc kubenswrapper[4717]: I0217 14:58:59.408200 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4kxp" event={"ID":"43eecc72-1d78-474c-b7c3-a773e72acb6a","Type":"ContainerDied","Data":"457d463fd45eec7ae00b6f0f78bff55097105ea5e84a8f9837229920cdb402c1"} Feb 17 14:58:59 crc kubenswrapper[4717]: I0217 14:58:59.410733 4717 generic.go:334] "Generic (PLEG): container finished" podID="46c61af5-321a-4195-8d6b-a95c65c5eae3" containerID="c6b6d48e33edc00e11440d8786a415cdfbbe502bf630f20da9916b1a68411a58" exitCode=0 Feb 17 14:58:59 crc kubenswrapper[4717]: I0217 14:58:59.410861 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rb6w5" event={"ID":"46c61af5-321a-4195-8d6b-a95c65c5eae3","Type":"ContainerDied","Data":"c6b6d48e33edc00e11440d8786a415cdfbbe502bf630f20da9916b1a68411a58"} Feb 17 14:58:59 crc kubenswrapper[4717]: I0217 14:58:59.429695 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jk5wb" event={"ID":"c636add4-d170-4295-8eeb-3f4972cf20d0","Type":"ContainerStarted","Data":"3e11ed4085a85d31ba84de015bf4297bfba64d211003b2552e48b75b799913d2"} Feb 17 14:58:59 crc kubenswrapper[4717]: I0217 14:58:59.538969 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jk5wb" podStartSLOduration=2.763570686 podStartE2EDuration="5.538937236s" podCreationTimestamp="2026-02-17 14:58:54 +0000 UTC" firstStartedPulling="2026-02-17 14:58:56.380614602 +0000 UTC m=+402.796455078" lastFinishedPulling="2026-02-17 14:58:59.155981152 +0000 UTC m=+405.571821628" observedRunningTime="2026-02-17 14:58:59.534889517 +0000 UTC m=+405.950730013" watchObservedRunningTime="2026-02-17 14:58:59.538937236 +0000 UTC m=+405.954777712" Feb 17 14:59:01 crc kubenswrapper[4717]: I0217 14:59:01.448853 4717 generic.go:334] "Generic (PLEG): container finished" podID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerID="b957085354d7503c278d5292e787777409672d3bdafa1a0351e82f0278d16162" exitCode=0 Feb 17 14:59:01 crc kubenswrapper[4717]: I0217 14:59:01.448976 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb79c" event={"ID":"f2ed0865-16aa-42df-94e8-c228b0baf920","Type":"ContainerDied","Data":"b957085354d7503c278d5292e787777409672d3bdafa1a0351e82f0278d16162"} Feb 17 14:59:01 crc kubenswrapper[4717]: I0217 14:59:01.455528 4717 generic.go:334] "Generic (PLEG): container finished" podID="46c61af5-321a-4195-8d6b-a95c65c5eae3" containerID="1d487ff9d08a2b661131f8c5400bcc18ff54df6a2a8cd4a10ce3dfbbf992cf04" exitCode=0 Feb 17 14:59:01 crc kubenswrapper[4717]: I0217 14:59:01.455621 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rb6w5" 
event={"ID":"46c61af5-321a-4195-8d6b-a95c65c5eae3","Type":"ContainerDied","Data":"1d487ff9d08a2b661131f8c5400bcc18ff54df6a2a8cd4a10ce3dfbbf992cf04"} Feb 17 14:59:01 crc kubenswrapper[4717]: I0217 14:59:01.459523 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4kxp" event={"ID":"43eecc72-1d78-474c-b7c3-a773e72acb6a","Type":"ContainerStarted","Data":"c15466b0588db3391e3e0ba9f8392e412fd26ec41cdf42c03cee814ce1d5b6ce"} Feb 17 14:59:01 crc kubenswrapper[4717]: I0217 14:59:01.510036 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b4kxp" podStartSLOduration=3.756121251 podStartE2EDuration="6.510006421s" podCreationTimestamp="2026-02-17 14:58:55 +0000 UTC" firstStartedPulling="2026-02-17 14:58:57.399932359 +0000 UTC m=+403.815772865" lastFinishedPulling="2026-02-17 14:59:00.153817559 +0000 UTC m=+406.569658035" observedRunningTime="2026-02-17 14:59:01.508923909 +0000 UTC m=+407.924764385" watchObservedRunningTime="2026-02-17 14:59:01.510006421 +0000 UTC m=+407.925846897" Feb 17 14:59:02 crc kubenswrapper[4717]: I0217 14:59:02.468806 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb79c" event={"ID":"f2ed0865-16aa-42df-94e8-c228b0baf920","Type":"ContainerStarted","Data":"eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec"} Feb 17 14:59:03 crc kubenswrapper[4717]: I0217 14:59:03.476384 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rb6w5" event={"ID":"46c61af5-321a-4195-8d6b-a95c65c5eae3","Type":"ContainerStarted","Data":"4c2e871b003f926cfe99244eebc8c35ba825bdc696d8c92bec482f5269809bc4"} Feb 17 14:59:03 crc kubenswrapper[4717]: I0217 14:59:03.497259 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jb79c" podStartSLOduration=3.665127306 
podStartE2EDuration="6.497241029s" podCreationTimestamp="2026-02-17 14:58:57 +0000 UTC" firstStartedPulling="2026-02-17 14:58:59.406400091 +0000 UTC m=+405.822240567" lastFinishedPulling="2026-02-17 14:59:02.238513814 +0000 UTC m=+408.654354290" observedRunningTime="2026-02-17 14:59:03.4962727 +0000 UTC m=+409.912113176" watchObservedRunningTime="2026-02-17 14:59:03.497241029 +0000 UTC m=+409.913081505" Feb 17 14:59:03 crc kubenswrapper[4717]: I0217 14:59:03.522385 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rb6w5" podStartSLOduration=3.140677522 podStartE2EDuration="6.522366165s" podCreationTimestamp="2026-02-17 14:58:57 +0000 UTC" firstStartedPulling="2026-02-17 14:58:59.411599853 +0000 UTC m=+405.827440329" lastFinishedPulling="2026-02-17 14:59:02.793288486 +0000 UTC m=+409.209128972" observedRunningTime="2026-02-17 14:59:03.517838232 +0000 UTC m=+409.933678708" watchObservedRunningTime="2026-02-17 14:59:03.522366165 +0000 UTC m=+409.938206631" Feb 17 14:59:05 crc kubenswrapper[4717]: I0217 14:59:05.322424 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:59:05 crc kubenswrapper[4717]: I0217 14:59:05.323157 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:59:05 crc kubenswrapper[4717]: I0217 14:59:05.380818 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:59:05 crc kubenswrapper[4717]: I0217 14:59:05.529457 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jk5wb" Feb 17 14:59:05 crc kubenswrapper[4717]: I0217 14:59:05.531642 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:59:05 
crc kubenswrapper[4717]: I0217 14:59:05.531702 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:59:05 crc kubenswrapper[4717]: I0217 14:59:05.585310 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:59:06 crc kubenswrapper[4717]: I0217 14:59:06.542199 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b4kxp" Feb 17 14:59:07 crc kubenswrapper[4717]: I0217 14:59:07.731928 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:59:07 crc kubenswrapper[4717]: I0217 14:59:07.732501 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:59:07 crc kubenswrapper[4717]: I0217 14:59:07.921315 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:59:07 crc kubenswrapper[4717]: I0217 14:59:07.921398 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:59:07 crc kubenswrapper[4717]: I0217 14:59:07.960807 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:59:08 crc kubenswrapper[4717]: I0217 14:59:08.556583 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rb6w5" Feb 17 14:59:08 crc kubenswrapper[4717]: I0217 14:59:08.773043 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jb79c" podUID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerName="registry-server" probeResult="failure" output=< Feb 17 14:59:08 crc 
kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 17 14:59:08 crc kubenswrapper[4717]: > Feb 17 14:59:17 crc kubenswrapper[4717]: I0217 14:59:17.776431 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:59:17 crc kubenswrapper[4717]: I0217 14:59:17.840261 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 14:59:20 crc kubenswrapper[4717]: I0217 14:59:20.808873 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:59:20 crc kubenswrapper[4717]: I0217 14:59:20.809330 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:59:20 crc kubenswrapper[4717]: I0217 14:59:20.809496 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 14:59:20 crc kubenswrapper[4717]: I0217 14:59:20.812476 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4beac6debb80927cd51e51b507515d3c5694b27b3026cda68a7eb6f2c64a5fbd"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:59:20 crc kubenswrapper[4717]: I0217 14:59:20.814496 4717 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://4beac6debb80927cd51e51b507515d3c5694b27b3026cda68a7eb6f2c64a5fbd" gracePeriod=600 Feb 17 14:59:21 crc kubenswrapper[4717]: I0217 14:59:21.600387 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="4beac6debb80927cd51e51b507515d3c5694b27b3026cda68a7eb6f2c64a5fbd" exitCode=0 Feb 17 14:59:21 crc kubenswrapper[4717]: I0217 14:59:21.600492 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"4beac6debb80927cd51e51b507515d3c5694b27b3026cda68a7eb6f2c64a5fbd"} Feb 17 14:59:21 crc kubenswrapper[4717]: I0217 14:59:21.601194 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"2be4d6f9ba1b351a5eaa6776c534f8fa0de4b5afbe4f304b46d3b99ccaa26b45"} Feb 17 14:59:21 crc kubenswrapper[4717]: I0217 14:59:21.601226 4717 scope.go:117] "RemoveContainer" containerID="e417a02e32f12213d874fbc49bd4db6d523ae443a25b57cec2f320aa9bbbc079" Feb 17 14:59:21 crc kubenswrapper[4717]: I0217 14:59:21.760699 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" podUID="9d4c20ae-0163-4deb-b965-5e3f7193d9e4" containerName="registry" containerID="cri-o://3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1" gracePeriod=30 Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.340622 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.429805 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-installation-pull-secrets\") pod \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.429893 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-bound-sa-token\") pod \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.430253 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.430311 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cfv5\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-kube-api-access-7cfv5\") pod \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.430392 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-ca-trust-extracted\") pod \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.430432 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-trusted-ca\") pod \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.430494 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-tls\") pod \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.430535 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-certificates\") pod \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\" (UID: \"9d4c20ae-0163-4deb-b965-5e3f7193d9e4\") " Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.431581 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4c20ae-0163-4deb-b965-5e3f7193d9e4" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.431982 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9d4c20ae-0163-4deb-b965-5e3f7193d9e4" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.432326 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.432352 4717 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.438383 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9d4c20ae-0163-4deb-b965-5e3f7193d9e4" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.447420 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-kube-api-access-7cfv5" (OuterVolumeSpecName: "kube-api-access-7cfv5") pod "9d4c20ae-0163-4deb-b965-5e3f7193d9e4" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4"). InnerVolumeSpecName "kube-api-access-7cfv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.447500 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9d4c20ae-0163-4deb-b965-5e3f7193d9e4" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.447603 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9d4c20ae-0163-4deb-b965-5e3f7193d9e4" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.451224 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9d4c20ae-0163-4deb-b965-5e3f7193d9e4" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.462826 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9d4c20ae-0163-4deb-b965-5e3f7193d9e4" (UID: "9d4c20ae-0163-4deb-b965-5e3f7193d9e4"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.533589 4717 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.533655 4717 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.533670 4717 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.533688 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.533700 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cfv5\" (UniqueName: \"kubernetes.io/projected/9d4c20ae-0163-4deb-b965-5e3f7193d9e4-kube-api-access-7cfv5\") on node \"crc\" DevicePath \"\"" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.634271 4717 generic.go:334] "Generic (PLEG): container finished" podID="9d4c20ae-0163-4deb-b965-5e3f7193d9e4" containerID="3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1" exitCode=0 Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.634508 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.634486 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" event={"ID":"9d4c20ae-0163-4deb-b965-5e3f7193d9e4","Type":"ContainerDied","Data":"3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1"} Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.634568 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mvgrd" event={"ID":"9d4c20ae-0163-4deb-b965-5e3f7193d9e4","Type":"ContainerDied","Data":"e69fdc03dc95f9838adf2eccaab026dffe9025feeaa658a1fb7778f3cb76e1a4"} Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.634596 4717 scope.go:117] "RemoveContainer" containerID="3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.662406 4717 scope.go:117] "RemoveContainer" containerID="3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1" Feb 17 14:59:22 crc kubenswrapper[4717]: E0217 14:59:22.663482 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1\": container with ID starting with 3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1 not found: ID does not exist" containerID="3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.663544 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1"} err="failed to get container status \"3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1\": rpc error: code = NotFound desc = could not find container 
\"3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1\": container with ID starting with 3f6ce4511116bd06bb069a230b12670a36896dfaa0cfa0b1e238c2b6eb1aeda1 not found: ID does not exist" Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.670376 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mvgrd"] Feb 17 14:59:22 crc kubenswrapper[4717]: I0217 14:59:22.683829 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mvgrd"] Feb 17 14:59:23 crc kubenswrapper[4717]: I0217 14:59:23.855240 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4c20ae-0163-4deb-b965-5e3f7193d9e4" path="/var/lib/kubelet/pods/9d4c20ae-0163-4deb-b965-5e3f7193d9e4/volumes" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.216655 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp"] Feb 17 15:00:00 crc kubenswrapper[4717]: E0217 15:00:00.217557 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4c20ae-0163-4deb-b965-5e3f7193d9e4" containerName="registry" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.217578 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4c20ae-0163-4deb-b965-5e3f7193d9e4" containerName="registry" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.217727 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d4c20ae-0163-4deb-b965-5e3f7193d9e4" containerName="registry" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.218231 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.224824 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.227435 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp"] Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.229707 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.311758 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d67da46-54ca-4904-93f0-e2d985a9cedc-secret-volume\") pod \"collect-profiles-29522340-vsdzp\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.311833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d67da46-54ca-4904-93f0-e2d985a9cedc-config-volume\") pod \"collect-profiles-29522340-vsdzp\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.311883 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz46z\" (UniqueName: \"kubernetes.io/projected/0d67da46-54ca-4904-93f0-e2d985a9cedc-kube-api-access-kz46z\") pod \"collect-profiles-29522340-vsdzp\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.414007 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d67da46-54ca-4904-93f0-e2d985a9cedc-secret-volume\") pod \"collect-profiles-29522340-vsdzp\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.414074 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d67da46-54ca-4904-93f0-e2d985a9cedc-config-volume\") pod \"collect-profiles-29522340-vsdzp\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.414176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz46z\" (UniqueName: \"kubernetes.io/projected/0d67da46-54ca-4904-93f0-e2d985a9cedc-kube-api-access-kz46z\") pod \"collect-profiles-29522340-vsdzp\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.415909 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d67da46-54ca-4904-93f0-e2d985a9cedc-config-volume\") pod \"collect-profiles-29522340-vsdzp\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.424594 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0d67da46-54ca-4904-93f0-e2d985a9cedc-secret-volume\") pod \"collect-profiles-29522340-vsdzp\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.442866 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz46z\" (UniqueName: \"kubernetes.io/projected/0d67da46-54ca-4904-93f0-e2d985a9cedc-kube-api-access-kz46z\") pod \"collect-profiles-29522340-vsdzp\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.542885 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:00 crc kubenswrapper[4717]: I0217 15:00:00.807006 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp"] Feb 17 15:00:01 crc kubenswrapper[4717]: I0217 15:00:01.075045 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" event={"ID":"0d67da46-54ca-4904-93f0-e2d985a9cedc","Type":"ContainerStarted","Data":"1182285e21d67d70e1e882b580603d358ec356c4b4e338b5a91afd30ececa508"} Feb 17 15:00:01 crc kubenswrapper[4717]: I0217 15:00:01.075124 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" event={"ID":"0d67da46-54ca-4904-93f0-e2d985a9cedc","Type":"ContainerStarted","Data":"009b2df0d922e0055626a66fdce30c30207b8abae6efd23c4b8b2d048f32fa30"} Feb 17 15:00:01 crc kubenswrapper[4717]: I0217 15:00:01.097729 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" 
podStartSLOduration=1.097699202 podStartE2EDuration="1.097699202s" podCreationTimestamp="2026-02-17 15:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:00:01.096987232 +0000 UTC m=+467.512827728" watchObservedRunningTime="2026-02-17 15:00:01.097699202 +0000 UTC m=+467.513539688" Feb 17 15:00:02 crc kubenswrapper[4717]: I0217 15:00:02.086382 4717 generic.go:334] "Generic (PLEG): container finished" podID="0d67da46-54ca-4904-93f0-e2d985a9cedc" containerID="1182285e21d67d70e1e882b580603d358ec356c4b4e338b5a91afd30ececa508" exitCode=0 Feb 17 15:00:02 crc kubenswrapper[4717]: I0217 15:00:02.086926 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" event={"ID":"0d67da46-54ca-4904-93f0-e2d985a9cedc","Type":"ContainerDied","Data":"1182285e21d67d70e1e882b580603d358ec356c4b4e338b5a91afd30ececa508"} Feb 17 15:00:03 crc kubenswrapper[4717]: I0217 15:00:03.486545 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:03 crc kubenswrapper[4717]: I0217 15:00:03.565900 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz46z\" (UniqueName: \"kubernetes.io/projected/0d67da46-54ca-4904-93f0-e2d985a9cedc-kube-api-access-kz46z\") pod \"0d67da46-54ca-4904-93f0-e2d985a9cedc\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " Feb 17 15:00:03 crc kubenswrapper[4717]: I0217 15:00:03.565969 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d67da46-54ca-4904-93f0-e2d985a9cedc-secret-volume\") pod \"0d67da46-54ca-4904-93f0-e2d985a9cedc\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " Feb 17 15:00:03 crc kubenswrapper[4717]: I0217 15:00:03.566072 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d67da46-54ca-4904-93f0-e2d985a9cedc-config-volume\") pod \"0d67da46-54ca-4904-93f0-e2d985a9cedc\" (UID: \"0d67da46-54ca-4904-93f0-e2d985a9cedc\") " Feb 17 15:00:03 crc kubenswrapper[4717]: I0217 15:00:03.568260 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d67da46-54ca-4904-93f0-e2d985a9cedc-config-volume" (OuterVolumeSpecName: "config-volume") pod "0d67da46-54ca-4904-93f0-e2d985a9cedc" (UID: "0d67da46-54ca-4904-93f0-e2d985a9cedc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:00:03 crc kubenswrapper[4717]: I0217 15:00:03.573431 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d67da46-54ca-4904-93f0-e2d985a9cedc-kube-api-access-kz46z" (OuterVolumeSpecName: "kube-api-access-kz46z") pod "0d67da46-54ca-4904-93f0-e2d985a9cedc" (UID: "0d67da46-54ca-4904-93f0-e2d985a9cedc"). 
InnerVolumeSpecName "kube-api-access-kz46z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:00:03 crc kubenswrapper[4717]: I0217 15:00:03.577368 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d67da46-54ca-4904-93f0-e2d985a9cedc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0d67da46-54ca-4904-93f0-e2d985a9cedc" (UID: "0d67da46-54ca-4904-93f0-e2d985a9cedc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:00:03 crc kubenswrapper[4717]: I0217 15:00:03.668189 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d67da46-54ca-4904-93f0-e2d985a9cedc-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:00:03 crc kubenswrapper[4717]: I0217 15:00:03.668259 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz46z\" (UniqueName: \"kubernetes.io/projected/0d67da46-54ca-4904-93f0-e2d985a9cedc-kube-api-access-kz46z\") on node \"crc\" DevicePath \"\"" Feb 17 15:00:03 crc kubenswrapper[4717]: I0217 15:00:03.668305 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d67da46-54ca-4904-93f0-e2d985a9cedc-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:00:04 crc kubenswrapper[4717]: I0217 15:00:04.103500 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" event={"ID":"0d67da46-54ca-4904-93f0-e2d985a9cedc","Type":"ContainerDied","Data":"009b2df0d922e0055626a66fdce30c30207b8abae6efd23c4b8b2d048f32fa30"} Feb 17 15:00:04 crc kubenswrapper[4717]: I0217 15:00:04.103559 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp" Feb 17 15:00:04 crc kubenswrapper[4717]: I0217 15:00:04.103574 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="009b2df0d922e0055626a66fdce30c30207b8abae6efd23c4b8b2d048f32fa30" Feb 17 15:01:16 crc kubenswrapper[4717]: I0217 15:01:16.247739 4717 scope.go:117] "RemoveContainer" containerID="1ae9fdbb731c2953e56022220c346e672a98c50f0462a9d51a6b7bbcec0007b5" Feb 17 15:01:50 crc kubenswrapper[4717]: I0217 15:01:50.808198 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:01:50 crc kubenswrapper[4717]: I0217 15:01:50.808934 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:02:16 crc kubenswrapper[4717]: I0217 15:02:16.281435 4717 scope.go:117] "RemoveContainer" containerID="ad70588b0505bef3caea0719c43968399fce8016d113a726540d0388cf50e5e7" Feb 17 15:02:16 crc kubenswrapper[4717]: I0217 15:02:16.302705 4717 scope.go:117] "RemoveContainer" containerID="b8edb95cc7b4f5d69fa83e8b8ad283e5c008bfa14cc054de1408f439b119dea9" Feb 17 15:02:20 crc kubenswrapper[4717]: I0217 15:02:20.808766 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:02:20 crc kubenswrapper[4717]: 
I0217 15:02:20.809191 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:02:50 crc kubenswrapper[4717]: I0217 15:02:50.809395 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:02:50 crc kubenswrapper[4717]: I0217 15:02:50.810437 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:02:50 crc kubenswrapper[4717]: I0217 15:02:50.810535 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:02:50 crc kubenswrapper[4717]: I0217 15:02:50.811837 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2be4d6f9ba1b351a5eaa6776c534f8fa0de4b5afbe4f304b46d3b99ccaa26b45"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:02:50 crc kubenswrapper[4717]: I0217 15:02:50.811981 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" 
containerName="machine-config-daemon" containerID="cri-o://2be4d6f9ba1b351a5eaa6776c534f8fa0de4b5afbe4f304b46d3b99ccaa26b45" gracePeriod=600 Feb 17 15:02:51 crc kubenswrapper[4717]: I0217 15:02:51.303909 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="2be4d6f9ba1b351a5eaa6776c534f8fa0de4b5afbe4f304b46d3b99ccaa26b45" exitCode=0 Feb 17 15:02:51 crc kubenswrapper[4717]: I0217 15:02:51.304005 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"2be4d6f9ba1b351a5eaa6776c534f8fa0de4b5afbe4f304b46d3b99ccaa26b45"} Feb 17 15:02:51 crc kubenswrapper[4717]: I0217 15:02:51.304567 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"a6cb2d452429431b113e7eb8a0c1c4fb59dfbfa50aec6a5702cb9844fb01cb5c"} Feb 17 15:02:51 crc kubenswrapper[4717]: I0217 15:02:51.304612 4717 scope.go:117] "RemoveContainer" containerID="4beac6debb80927cd51e51b507515d3c5694b27b3026cda68a7eb6f2c64a5fbd" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.017458 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n2cmk"] Feb 17 15:04:12 crc kubenswrapper[4717]: E0217 15:04:12.018268 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d67da46-54ca-4904-93f0-e2d985a9cedc" containerName="collect-profiles" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.018284 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d67da46-54ca-4904-93f0-e2d985a9cedc" containerName="collect-profiles" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.018401 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d67da46-54ca-4904-93f0-e2d985a9cedc" 
containerName="collect-profiles" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.018893 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2cmk" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.023038 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.023349 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-prb99" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.023743 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.031981 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n2cmk"] Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.034576 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6m2s\" (UniqueName: \"kubernetes.io/projected/52e5e91c-8239-4ffb-bc9a-a5dc8780f6e3-kube-api-access-b6m2s\") pod \"cert-manager-cainjector-cf98fcc89-n2cmk\" (UID: \"52e5e91c-8239-4ffb-bc9a-a5dc8780f6e3\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2cmk" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.042994 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-kfbxb"] Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.044071 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kfbxb" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.045871 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8w6xj" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.070265 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-kfbxb"] Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.076694 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-n82c5"] Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.077794 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-n82c5" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.080471 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2hmx7" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.084946 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-n82c5"] Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.135724 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dgc6\" (UniqueName: \"kubernetes.io/projected/448f5725-7b22-4ceb-8c7f-f1e8883d423b-kube-api-access-5dgc6\") pod \"cert-manager-858654f9db-kfbxb\" (UID: \"448f5725-7b22-4ceb-8c7f-f1e8883d423b\") " pod="cert-manager/cert-manager-858654f9db-kfbxb" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.135792 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6m2s\" (UniqueName: \"kubernetes.io/projected/52e5e91c-8239-4ffb-bc9a-a5dc8780f6e3-kube-api-access-b6m2s\") pod \"cert-manager-cainjector-cf98fcc89-n2cmk\" (UID: \"52e5e91c-8239-4ffb-bc9a-a5dc8780f6e3\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2cmk" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.135925 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqknk\" (UniqueName: \"kubernetes.io/projected/04796f6f-6755-4e30-9fec-7003006dd113-kube-api-access-pqknk\") pod \"cert-manager-webhook-687f57d79b-n82c5\" (UID: \"04796f6f-6755-4e30-9fec-7003006dd113\") " pod="cert-manager/cert-manager-webhook-687f57d79b-n82c5" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.166655 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6m2s\" (UniqueName: \"kubernetes.io/projected/52e5e91c-8239-4ffb-bc9a-a5dc8780f6e3-kube-api-access-b6m2s\") pod \"cert-manager-cainjector-cf98fcc89-n2cmk\" (UID: \"52e5e91c-8239-4ffb-bc9a-a5dc8780f6e3\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2cmk" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.236919 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqknk\" (UniqueName: \"kubernetes.io/projected/04796f6f-6755-4e30-9fec-7003006dd113-kube-api-access-pqknk\") pod \"cert-manager-webhook-687f57d79b-n82c5\" (UID: \"04796f6f-6755-4e30-9fec-7003006dd113\") " pod="cert-manager/cert-manager-webhook-687f57d79b-n82c5" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.237038 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dgc6\" (UniqueName: \"kubernetes.io/projected/448f5725-7b22-4ceb-8c7f-f1e8883d423b-kube-api-access-5dgc6\") pod \"cert-manager-858654f9db-kfbxb\" (UID: \"448f5725-7b22-4ceb-8c7f-f1e8883d423b\") " pod="cert-manager/cert-manager-858654f9db-kfbxb" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.259768 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dgc6\" (UniqueName: 
\"kubernetes.io/projected/448f5725-7b22-4ceb-8c7f-f1e8883d423b-kube-api-access-5dgc6\") pod \"cert-manager-858654f9db-kfbxb\" (UID: \"448f5725-7b22-4ceb-8c7f-f1e8883d423b\") " pod="cert-manager/cert-manager-858654f9db-kfbxb" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.262158 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqknk\" (UniqueName: \"kubernetes.io/projected/04796f6f-6755-4e30-9fec-7003006dd113-kube-api-access-pqknk\") pod \"cert-manager-webhook-687f57d79b-n82c5\" (UID: \"04796f6f-6755-4e30-9fec-7003006dd113\") " pod="cert-manager/cert-manager-webhook-687f57d79b-n82c5" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.342319 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2cmk" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.365186 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kfbxb" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.400710 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-n82c5" Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.617599 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n2cmk"] Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.623616 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-kfbxb"] Feb 17 15:04:12 crc kubenswrapper[4717]: W0217 15:04:12.628799 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52e5e91c_8239_4ffb_bc9a_a5dc8780f6e3.slice/crio-0ef80f13cdd041e17d27c3c64f130b0f0eb2b1cc8f481fd058382e59f74d78f6 WatchSource:0}: Error finding container 0ef80f13cdd041e17d27c3c64f130b0f0eb2b1cc8f481fd058382e59f74d78f6: Status 404 returned error can't find the container with id 0ef80f13cdd041e17d27c3c64f130b0f0eb2b1cc8f481fd058382e59f74d78f6 Feb 17 15:04:12 crc kubenswrapper[4717]: W0217 15:04:12.630415 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod448f5725_7b22_4ceb_8c7f_f1e8883d423b.slice/crio-eb6f3142c451f08a93ddfd470600dd2c05485aaee4473f94f23978d1e5e7451e WatchSource:0}: Error finding container eb6f3142c451f08a93ddfd470600dd2c05485aaee4473f94f23978d1e5e7451e: Status 404 returned error can't find the container with id eb6f3142c451f08a93ddfd470600dd2c05485aaee4473f94f23978d1e5e7451e Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.633697 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.934968 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2cmk" 
event={"ID":"52e5e91c-8239-4ffb-bc9a-a5dc8780f6e3","Type":"ContainerStarted","Data":"0ef80f13cdd041e17d27c3c64f130b0f0eb2b1cc8f481fd058382e59f74d78f6"} Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.936524 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-kfbxb" event={"ID":"448f5725-7b22-4ceb-8c7f-f1e8883d423b","Type":"ContainerStarted","Data":"eb6f3142c451f08a93ddfd470600dd2c05485aaee4473f94f23978d1e5e7451e"} Feb 17 15:04:12 crc kubenswrapper[4717]: I0217 15:04:12.966884 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-n82c5"] Feb 17 15:04:12 crc kubenswrapper[4717]: W0217 15:04:12.972273 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04796f6f_6755_4e30_9fec_7003006dd113.slice/crio-562c07173838c44cdd0dc6f81ae31f1c9e5bf4ce3481493ada38952492b7bb63 WatchSource:0}: Error finding container 562c07173838c44cdd0dc6f81ae31f1c9e5bf4ce3481493ada38952492b7bb63: Status 404 returned error can't find the container with id 562c07173838c44cdd0dc6f81ae31f1c9e5bf4ce3481493ada38952492b7bb63 Feb 17 15:04:13 crc kubenswrapper[4717]: I0217 15:04:13.949498 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-n82c5" event={"ID":"04796f6f-6755-4e30-9fec-7003006dd113","Type":"ContainerStarted","Data":"562c07173838c44cdd0dc6f81ae31f1c9e5bf4ce3481493ada38952492b7bb63"} Feb 17 15:04:16 crc kubenswrapper[4717]: I0217 15:04:16.971999 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-kfbxb" event={"ID":"448f5725-7b22-4ceb-8c7f-f1e8883d423b","Type":"ContainerStarted","Data":"73aa6de9d4257b6b324633073bc82f7e520e40f3e278fbe2741ce2d8c053a045"} Feb 17 15:04:16 crc kubenswrapper[4717]: I0217 15:04:16.973846 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-687f57d79b-n82c5" event={"ID":"04796f6f-6755-4e30-9fec-7003006dd113","Type":"ContainerStarted","Data":"9127814599e920930b7b6694931590139128b0edf4966cba42282f32b54645ca"} Feb 17 15:04:16 crc kubenswrapper[4717]: I0217 15:04:16.974135 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-n82c5" Feb 17 15:04:16 crc kubenswrapper[4717]: I0217 15:04:16.975951 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2cmk" event={"ID":"52e5e91c-8239-4ffb-bc9a-a5dc8780f6e3","Type":"ContainerStarted","Data":"b9767cca6400b0919af5b5869b2873df74641466a3fc0939570e7526c8fb517d"} Feb 17 15:04:16 crc kubenswrapper[4717]: I0217 15:04:16.992928 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-kfbxb" podStartSLOduration=1.2486979 podStartE2EDuration="4.992902454s" podCreationTimestamp="2026-02-17 15:04:12 +0000 UTC" firstStartedPulling="2026-02-17 15:04:12.63338095 +0000 UTC m=+719.049221416" lastFinishedPulling="2026-02-17 15:04:16.377585494 +0000 UTC m=+722.793425970" observedRunningTime="2026-02-17 15:04:16.992179153 +0000 UTC m=+723.408019659" watchObservedRunningTime="2026-02-17 15:04:16.992902454 +0000 UTC m=+723.408742930" Feb 17 15:04:17 crc kubenswrapper[4717]: I0217 15:04:17.033698 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2cmk" podStartSLOduration=2.316334981 podStartE2EDuration="6.033674252s" podCreationTimestamp="2026-02-17 15:04:11 +0000 UTC" firstStartedPulling="2026-02-17 15:04:12.636247921 +0000 UTC m=+719.052088397" lastFinishedPulling="2026-02-17 15:04:16.353587152 +0000 UTC m=+722.769427668" observedRunningTime="2026-02-17 15:04:17.029886984 +0000 UTC m=+723.445727480" watchObservedRunningTime="2026-02-17 15:04:17.033674252 +0000 UTC m=+723.449514728" Feb 17 
15:04:17 crc kubenswrapper[4717]: I0217 15:04:17.053410 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-n82c5" podStartSLOduration=1.6683766420000001 podStartE2EDuration="5.053387292s" podCreationTimestamp="2026-02-17 15:04:12 +0000 UTC" firstStartedPulling="2026-02-17 15:04:12.975428707 +0000 UTC m=+719.391269233" lastFinishedPulling="2026-02-17 15:04:16.360439407 +0000 UTC m=+722.776279883" observedRunningTime="2026-02-17 15:04:17.051271742 +0000 UTC m=+723.467112218" watchObservedRunningTime="2026-02-17 15:04:17.053387292 +0000 UTC m=+723.469227768" Feb 17 15:04:21 crc kubenswrapper[4717]: I0217 15:04:21.981192 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4f7wr"] Feb 17 15:04:21 crc kubenswrapper[4717]: I0217 15:04:21.989075 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovn-controller" containerID="cri-o://da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce" gracePeriod=30 Feb 17 15:04:21 crc kubenswrapper[4717]: I0217 15:04:21.989159 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="nbdb" containerID="cri-o://1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13" gracePeriod=30 Feb 17 15:04:21 crc kubenswrapper[4717]: I0217 15:04:21.989292 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="northd" containerID="cri-o://d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305" gracePeriod=30 Feb 17 15:04:21 crc kubenswrapper[4717]: I0217 15:04:21.989364 4717 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555" gracePeriod=30 Feb 17 15:04:21 crc kubenswrapper[4717]: I0217 15:04:21.989422 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="kube-rbac-proxy-node" containerID="cri-o://b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06" gracePeriod=30 Feb 17 15:04:21 crc kubenswrapper[4717]: I0217 15:04:21.989478 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovn-acl-logging" containerID="cri-o://59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a" gracePeriod=30 Feb 17 15:04:21 crc kubenswrapper[4717]: I0217 15:04:21.989617 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="sbdb" containerID="cri-o://4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771" gracePeriod=30 Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.027662 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" containerID="cri-o://a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822" gracePeriod=30 Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.323236 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/4.log" Feb 17 15:04:22 crc 
kubenswrapper[4717]: I0217 15:04:22.323953 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/3.log" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.328773 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovn-acl-logging/0.log" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.329454 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovn-controller/0.log" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.330048 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.401453 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-655tp"] Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.401806 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="kubecfg-setup" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.401834 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="kubecfg-setup" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.401859 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="northd" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.401872 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="northd" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.401887 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" 
containerName="ovn-acl-logging" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.401898 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovn-acl-logging" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.401910 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.401919 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.401932 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.401939 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.401951 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.401960 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.401973 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="kube-rbac-proxy-node" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.401983 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="kube-rbac-proxy-node" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.402000 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402013 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.402030 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="nbdb" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402038 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="nbdb" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.402050 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="sbdb" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402058 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="sbdb" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.402066 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402077 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.402129 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovn-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402137 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovn-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: E0217 15:04:22.402146 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" 
containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402153 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402273 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402285 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402298 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovn-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402313 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovn-acl-logging" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402328 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="nbdb" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402340 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="sbdb" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402351 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402362 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402371 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="kube-rbac-proxy-node" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402383 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402397 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="northd" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.402725 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerName="ovnkube-controller" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.405997 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-n82c5" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.406242 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502040 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkszq\" (UniqueName: \"kubernetes.io/projected/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-kube-api-access-gkszq\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502131 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-config\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502208 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502267 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502345 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-openvswitch\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502385 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502698 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-log-socket\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502746 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-systemd-units\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502751 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502774 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-node-log\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502787 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-log-socket" (OuterVolumeSpecName: "log-socket") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502793 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-kubelet\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502808 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502819 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-var-lib-openvswitch\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502828 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-node-log" (OuterVolumeSpecName: "node-log") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502850 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-bin\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502850 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502868 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-etc-openvswitch\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502885 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502894 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-env-overrides\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") " Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502912 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502922 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502918 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-netns\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") "
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502953 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-systemd\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") "
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502958 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.502977 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovn-node-metrics-cert\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") "
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.503006 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-netd\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") "
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.503038 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-ovn\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") "
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.503238 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-slash\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") "
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.503950 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-script-lib\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") "
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.503974 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-ovn-kubernetes\") pod \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\" (UID: \"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6\") "
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.503334 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.503391 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.503415 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.503438 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-slash" (OuterVolumeSpecName: "host-slash") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504048 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-run-netns\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504148 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-kubelet\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504092 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504186 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-cni-netd\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504216 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-etc-openvswitch\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504238 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-cni-bin\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504263 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-log-socket\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504286 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-systemd-units\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504385 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-run-ovn-kubernetes\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504442 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-env-overrides\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504481 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg955\" (UniqueName: \"kubernetes.io/projected/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-kube-api-access-wg955\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504531 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-run-openvswitch\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504555 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504600 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-ovnkube-config\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504622 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-slash\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504658 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-node-log\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504693 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-run-ovn\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504722 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-var-lib-openvswitch\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504739 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-run-systemd\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-ovn-node-metrics-cert\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504806 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-ovnkube-script-lib\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504873 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504889 4717 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504903 4717 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504913 4717 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-log-socket\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504887 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.504922 4717 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-systemd-units\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.505016 4717 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-node-log\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.505033 4717 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-kubelet\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.505044 4717 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.505053 4717 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-bin\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.505062 4717 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.505181 4717 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.505217 4717 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-netns\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.505233 4717 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-cni-netd\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.505247 4717 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.505263 4717 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-slash\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.505277 4717 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.508039 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.508113 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-kube-api-access-gkszq" (OuterVolumeSpecName: "kube-api-access-gkszq") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "kube-api-access-gkszq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.519036 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" (UID: "c5c8492f-64dc-4b1a-8041-d45d5ebb04f6"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607101 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-ovnkube-config\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607167 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-slash\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607197 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-node-log\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607260 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-run-ovn\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607286 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-var-lib-openvswitch\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607311 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-run-systemd\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607339 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-ovn-node-metrics-cert\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607343 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-node-log\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607346 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-slash\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607362 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-ovnkube-script-lib\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607417 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-var-lib-openvswitch\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607472 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-run-netns\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607513 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-kubelet\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607515 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-run-ovn\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607610 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-run-systemd\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607563 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-cni-netd\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607545 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-cni-netd\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607673 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-etc-openvswitch\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607736 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-cni-bin\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607784 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-systemd-units\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607830 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-log-socket\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607843 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-ovnkube-config\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607577 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-run-netns\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607593 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-kubelet\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607926 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-etc-openvswitch\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.607952 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-cni-bin\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608023 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-ovnkube-script-lib\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608027 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-log-socket\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608033 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-systemd-units\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-run-ovn-kubernetes\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608140 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-env-overrides\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608164 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg955\" (UniqueName: \"kubernetes.io/projected/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-kube-api-access-wg955\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608190 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-run-openvswitch\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608210 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608238 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-run-ovn-kubernetes\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608260 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608284 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608297 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkszq\" (UniqueName: \"kubernetes.io/projected/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-kube-api-access-gkszq\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608315 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-run-openvswitch\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608325 4717 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-run-systemd\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608353 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.608813 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-env-overrides\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.614945 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-ovn-node-metrics-cert\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.626611 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg955\" (UniqueName: \"kubernetes.io/projected/75bb1f4d-328d-45b4-90f1-0e6bb237beeb-kube-api-access-wg955\") pod \"ovnkube-node-655tp\" (UID: \"75bb1f4d-328d-45b4-90f1-0e6bb237beeb\") " pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:22 crc kubenswrapper[4717]: I0217 15:04:22.722534 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-655tp"
Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.024525 4717 generic.go:334] "Generic (PLEG): container finished" podID="75bb1f4d-328d-45b4-90f1-0e6bb237beeb" containerID="55ee4d8908b47b80ea46c2ffa32eed6a4d3e1b6bec7520c585dda22b397b68fe" exitCode=0
Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.024601 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" event={"ID":"75bb1f4d-328d-45b4-90f1-0e6bb237beeb","Type":"ContainerDied","Data":"55ee4d8908b47b80ea46c2ffa32eed6a4d3e1b6bec7520c585dda22b397b68fe"}
Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.024631 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" event={"ID":"75bb1f4d-328d-45b4-90f1-0e6bb237beeb","Type":"ContainerStarted","Data":"cf395d3b43da63321fd39a9ea58b61a9d8f12da4b163e5d5cec7db929a5c7bf0"}
Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.026674 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nfvrt_3daa865c-6e58-4512-9be1-5d3a490a2f7a/kube-multus/2.log"
Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.027452 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nfvrt_3daa865c-6e58-4512-9be1-5d3a490a2f7a/kube-multus/1.log"
Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.027494 4717 generic.go:334] "Generic (PLEG): container finished" podID="3daa865c-6e58-4512-9be1-5d3a490a2f7a" containerID="a852fe0b82a2d6ba3bce6e311bef1cdc6fdad339fd7922aa1007be30e0774e55" exitCode=2
Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.027567 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nfvrt" event={"ID":"3daa865c-6e58-4512-9be1-5d3a490a2f7a","Type":"ContainerDied","Data":"a852fe0b82a2d6ba3bce6e311bef1cdc6fdad339fd7922aa1007be30e0774e55"}
Feb 17 15:04:23 crc kubenswrapper[4717]: I0217
15:04:23.027607 4717 scope.go:117] "RemoveContainer" containerID="01f9e46aae86320f6c1bcc534e3cbe373ef0df5082d7386e2e382d1c60228ca6" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.028155 4717 scope.go:117] "RemoveContainer" containerID="a852fe0b82a2d6ba3bce6e311bef1cdc6fdad339fd7922aa1007be30e0774e55" Feb 17 15:04:23 crc kubenswrapper[4717]: E0217 15:04:23.028687 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-nfvrt_openshift-multus(3daa865c-6e58-4512-9be1-5d3a490a2f7a)\"" pod="openshift-multus/multus-nfvrt" podUID="3daa865c-6e58-4512-9be1-5d3a490a2f7a" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.030599 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/4.log" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.031252 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovnkube-controller/3.log" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.041974 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovn-acl-logging/0.log" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.042887 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4f7wr_c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/ovn-controller/0.log" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043435 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822" exitCode=2 Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043461 4717 generic.go:334] "Generic 
(PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771" exitCode=0 Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043469 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13" exitCode=0 Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043476 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305" exitCode=0 Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043483 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555" exitCode=0 Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043490 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06" exitCode=0 Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043497 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a" exitCode=143 Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043509 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" containerID="da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce" exitCode=143 Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043539 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" 
event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043614 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043655 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043672 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043687 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043720 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822"} Feb 17 15:04:23 
crc kubenswrapper[4717]: I0217 15:04:23.043736 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043745 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043753 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043765 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043773 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043781 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043791 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043799 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce"} Feb 17 15:04:23 crc 
kubenswrapper[4717]: I0217 15:04:23.043807 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043819 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043831 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043840 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043848 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043855 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043866 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043873 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043881 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043889 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043896 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043904 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043914 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043925 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043934 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043942 4717 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043949 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043946 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.043957 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045184 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045269 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045286 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045352 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045371 4717 pod_container_deletor.go:114] "Failed to issue the request to 
remove container" containerID={"Type":"cri-o","ID":"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045450 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f7wr" event={"ID":"c5c8492f-64dc-4b1a-8041-d45d5ebb04f6","Type":"ContainerDied","Data":"26b1f66774dfdd2c7653e701d74aaffe333905b155324c182a3a7ca8c39433e8"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045593 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045615 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045631 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045645 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045659 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045673 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045686 4717 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045700 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045714 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.045728 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b"} Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.056514 4717 scope.go:117] "RemoveContainer" containerID="a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.094711 4717 scope.go:117] "RemoveContainer" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.122010 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4f7wr"] Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.124770 4717 scope.go:117] "RemoveContainer" containerID="4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.127967 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4f7wr"] Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.161179 4717 scope.go:117] "RemoveContainer" containerID="1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13" Feb 17 15:04:23 
crc kubenswrapper[4717]: I0217 15:04:23.177228 4717 scope.go:117] "RemoveContainer" containerID="d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.196039 4717 scope.go:117] "RemoveContainer" containerID="1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.223383 4717 scope.go:117] "RemoveContainer" containerID="b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.262615 4717 scope.go:117] "RemoveContainer" containerID="59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.277920 4717 scope.go:117] "RemoveContainer" containerID="da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.302070 4717 scope.go:117] "RemoveContainer" containerID="c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.322795 4717 scope.go:117] "RemoveContainer" containerID="a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822" Feb 17 15:04:23 crc kubenswrapper[4717]: E0217 15:04:23.323617 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822\": container with ID starting with a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822 not found: ID does not exist" containerID="a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.323667 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822"} err="failed to get container status 
\"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822\": rpc error: code = NotFound desc = could not find container \"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822\": container with ID starting with a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.323702 4717 scope.go:117] "RemoveContainer" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" Feb 17 15:04:23 crc kubenswrapper[4717]: E0217 15:04:23.324261 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\": container with ID starting with 19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b not found: ID does not exist" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.324303 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b"} err="failed to get container status \"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\": rpc error: code = NotFound desc = could not find container \"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\": container with ID starting with 19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.324334 4717 scope.go:117] "RemoveContainer" containerID="4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771" Feb 17 15:04:23 crc kubenswrapper[4717]: E0217 15:04:23.324743 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\": container with ID starting with 4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771 not found: ID does not exist" containerID="4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.324770 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771"} err="failed to get container status \"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\": rpc error: code = NotFound desc = could not find container \"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\": container with ID starting with 4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.324785 4717 scope.go:117] "RemoveContainer" containerID="1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13" Feb 17 15:04:23 crc kubenswrapper[4717]: E0217 15:04:23.325132 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\": container with ID starting with 1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13 not found: ID does not exist" containerID="1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.325160 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13"} err="failed to get container status \"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\": rpc error: code = NotFound desc = could not find container \"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\": container with ID 
starting with 1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.325178 4717 scope.go:117] "RemoveContainer" containerID="d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305" Feb 17 15:04:23 crc kubenswrapper[4717]: E0217 15:04:23.325453 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\": container with ID starting with d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305 not found: ID does not exist" containerID="d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.325481 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305"} err="failed to get container status \"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\": rpc error: code = NotFound desc = could not find container \"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\": container with ID starting with d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.325497 4717 scope.go:117] "RemoveContainer" containerID="1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555" Feb 17 15:04:23 crc kubenswrapper[4717]: E0217 15:04:23.325830 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\": container with ID starting with 1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555 not found: ID does not exist" containerID="1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555" Feb 17 
15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.325863 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555"} err="failed to get container status \"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\": rpc error: code = NotFound desc = could not find container \"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\": container with ID starting with 1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.325884 4717 scope.go:117] "RemoveContainer" containerID="b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06" Feb 17 15:04:23 crc kubenswrapper[4717]: E0217 15:04:23.326315 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\": container with ID starting with b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06 not found: ID does not exist" containerID="b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.326350 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06"} err="failed to get container status \"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\": rpc error: code = NotFound desc = could not find container \"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\": container with ID starting with b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.326372 4717 scope.go:117] "RemoveContainer" 
containerID="59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a" Feb 17 15:04:23 crc kubenswrapper[4717]: E0217 15:04:23.326783 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\": container with ID starting with 59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a not found: ID does not exist" containerID="59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.326815 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a"} err="failed to get container status \"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\": rpc error: code = NotFound desc = could not find container \"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\": container with ID starting with 59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.326833 4717 scope.go:117] "RemoveContainer" containerID="da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce" Feb 17 15:04:23 crc kubenswrapper[4717]: E0217 15:04:23.327161 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\": container with ID starting with da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce not found: ID does not exist" containerID="da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.327192 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce"} err="failed to get container status \"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\": rpc error: code = NotFound desc = could not find container \"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\": container with ID starting with da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.327212 4717 scope.go:117] "RemoveContainer" containerID="c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b" Feb 17 15:04:23 crc kubenswrapper[4717]: E0217 15:04:23.328301 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\": container with ID starting with c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b not found: ID does not exist" containerID="c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.328330 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b"} err="failed to get container status \"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\": rpc error: code = NotFound desc = could not find container \"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\": container with ID starting with c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.328379 4717 scope.go:117] "RemoveContainer" containerID="a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.329023 4717 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822"} err="failed to get container status \"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822\": rpc error: code = NotFound desc = could not find container \"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822\": container with ID starting with a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.329054 4717 scope.go:117] "RemoveContainer" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.329467 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b"} err="failed to get container status \"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\": rpc error: code = NotFound desc = could not find container \"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\": container with ID starting with 19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.329518 4717 scope.go:117] "RemoveContainer" containerID="4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.329951 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771"} err="failed to get container status \"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\": rpc error: code = NotFound desc = could not find container \"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\": container with ID starting with 4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771 not 
found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.329998 4717 scope.go:117] "RemoveContainer" containerID="1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.330300 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13"} err="failed to get container status \"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\": rpc error: code = NotFound desc = could not find container \"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\": container with ID starting with 1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.330350 4717 scope.go:117] "RemoveContainer" containerID="d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.330763 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305"} err="failed to get container status \"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\": rpc error: code = NotFound desc = could not find container \"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\": container with ID starting with d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.330786 4717 scope.go:117] "RemoveContainer" containerID="1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.331124 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555"} err="failed to get 
container status \"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\": rpc error: code = NotFound desc = could not find container \"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\": container with ID starting with 1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.331172 4717 scope.go:117] "RemoveContainer" containerID="b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.331668 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06"} err="failed to get container status \"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\": rpc error: code = NotFound desc = could not find container \"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\": container with ID starting with b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.331700 4717 scope.go:117] "RemoveContainer" containerID="59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.332281 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a"} err="failed to get container status \"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\": rpc error: code = NotFound desc = could not find container \"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\": container with ID starting with 59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.332332 4717 scope.go:117] "RemoveContainer" 
containerID="da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.332679 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce"} err="failed to get container status \"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\": rpc error: code = NotFound desc = could not find container \"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\": container with ID starting with da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.332705 4717 scope.go:117] "RemoveContainer" containerID="c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.332987 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b"} err="failed to get container status \"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\": rpc error: code = NotFound desc = could not find container \"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\": container with ID starting with c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.333015 4717 scope.go:117] "RemoveContainer" containerID="a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.333459 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822"} err="failed to get container status \"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822\": rpc error: code = NotFound desc = could 
not find container \"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822\": container with ID starting with a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.333485 4717 scope.go:117] "RemoveContainer" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.333978 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b"} err="failed to get container status \"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\": rpc error: code = NotFound desc = could not find container \"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\": container with ID starting with 19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.334011 4717 scope.go:117] "RemoveContainer" containerID="4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.334372 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771"} err="failed to get container status \"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\": rpc error: code = NotFound desc = could not find container \"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\": container with ID starting with 4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.334397 4717 scope.go:117] "RemoveContainer" containerID="1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 
15:04:23.334769 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13"} err="failed to get container status \"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\": rpc error: code = NotFound desc = could not find container \"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\": container with ID starting with 1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.334793 4717 scope.go:117] "RemoveContainer" containerID="d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.335211 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305"} err="failed to get container status \"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\": rpc error: code = NotFound desc = could not find container \"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\": container with ID starting with d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.335248 4717 scope.go:117] "RemoveContainer" containerID="1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.335795 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555"} err="failed to get container status \"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\": rpc error: code = NotFound desc = could not find container \"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\": container with ID starting with 
1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.335824 4717 scope.go:117] "RemoveContainer" containerID="b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.336966 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06"} err="failed to get container status \"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\": rpc error: code = NotFound desc = could not find container \"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\": container with ID starting with b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.337003 4717 scope.go:117] "RemoveContainer" containerID="59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.337978 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a"} err="failed to get container status \"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\": rpc error: code = NotFound desc = could not find container \"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\": container with ID starting with 59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.338051 4717 scope.go:117] "RemoveContainer" containerID="da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.338654 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce"} err="failed to get container status \"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\": rpc error: code = NotFound desc = could not find container \"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\": container with ID starting with da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.338681 4717 scope.go:117] "RemoveContainer" containerID="c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.339055 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b"} err="failed to get container status \"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\": rpc error: code = NotFound desc = could not find container \"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\": container with ID starting with c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.339131 4717 scope.go:117] "RemoveContainer" containerID="a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.339870 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822"} err="failed to get container status \"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822\": rpc error: code = NotFound desc = could not find container \"a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822\": container with ID starting with a55b1749a581e116aa0b8f5aefbd008d36e725ec5c736a387f722bc77db9c822 not found: ID does not 
exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.339902 4717 scope.go:117] "RemoveContainer" containerID="19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.340283 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b"} err="failed to get container status \"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\": rpc error: code = NotFound desc = could not find container \"19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b\": container with ID starting with 19b037c54bf6e0ea08c77db6bbc6b4dc30bea24fcf757fd89a1f80645686932b not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.340311 4717 scope.go:117] "RemoveContainer" containerID="4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.340621 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771"} err="failed to get container status \"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\": rpc error: code = NotFound desc = could not find container \"4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771\": container with ID starting with 4c7c5d050e37a6a7c40c42edc653010a10cef67c37265c0f67b3a9b79302c771 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.340646 4717 scope.go:117] "RemoveContainer" containerID="1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.341018 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13"} err="failed to get container status 
\"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\": rpc error: code = NotFound desc = could not find container \"1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13\": container with ID starting with 1870e257b298d1fd1bb330cd129640bde3bcd0a3ff0193baa583beedf9894e13 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.341047 4717 scope.go:117] "RemoveContainer" containerID="d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.341354 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305"} err="failed to get container status \"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\": rpc error: code = NotFound desc = could not find container \"d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305\": container with ID starting with d7f2ab4ac6de3721770b8f051730cee1411cf8d13163abbafbf8cb2bfebff305 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.341382 4717 scope.go:117] "RemoveContainer" containerID="1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.341810 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555"} err="failed to get container status \"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\": rpc error: code = NotFound desc = could not find container \"1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555\": container with ID starting with 1d040d3b577e3be4199a63109fb6de778f9ba8f1f232fca6a5bc5206cf4e1555 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.341863 4717 scope.go:117] "RemoveContainer" 
containerID="b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.342226 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06"} err="failed to get container status \"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\": rpc error: code = NotFound desc = could not find container \"b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06\": container with ID starting with b61b7b1550753691ee85f73cb4bc16ae5f7519e0df792debd123952824426c06 not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.342254 4717 scope.go:117] "RemoveContainer" containerID="59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.342622 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a"} err="failed to get container status \"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\": rpc error: code = NotFound desc = could not find container \"59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a\": container with ID starting with 59ddf843cb671c133be45c1a25d8066ef6c49e5004f07dd54e52615b9897069a not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.342685 4717 scope.go:117] "RemoveContainer" containerID="da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.343155 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce"} err="failed to get container status \"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\": rpc error: code = NotFound desc = could 
not find container \"da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce\": container with ID starting with da97b564e5dd819991b99bcd5ee3382eb2c2bbe3cc3b7d7765a31e883bb9c5ce not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.343187 4717 scope.go:117] "RemoveContainer" containerID="c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.343482 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b"} err="failed to get container status \"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\": rpc error: code = NotFound desc = could not find container \"c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b\": container with ID starting with c9d04c50f992c75a245046d1d311f6b94e307cfef5d85f6139964308ce72976b not found: ID does not exist" Feb 17 15:04:23 crc kubenswrapper[4717]: I0217 15:04:23.855693 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c8492f-64dc-4b1a-8041-d45d5ebb04f6" path="/var/lib/kubelet/pods/c5c8492f-64dc-4b1a-8041-d45d5ebb04f6/volumes" Feb 17 15:04:24 crc kubenswrapper[4717]: I0217 15:04:24.055717 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" event={"ID":"75bb1f4d-328d-45b4-90f1-0e6bb237beeb","Type":"ContainerStarted","Data":"f6929c354cc9aeeb135faccd6b210cad4b29eb1280c198e4e1819fa1c3fc3a0e"} Feb 17 15:04:24 crc kubenswrapper[4717]: I0217 15:04:24.056340 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" event={"ID":"75bb1f4d-328d-45b4-90f1-0e6bb237beeb","Type":"ContainerStarted","Data":"ead489d4de1f14cf6c4895defe731976cdc46d7b27ada12efeae6f2e28de9bbf"} Feb 17 15:04:24 crc kubenswrapper[4717]: I0217 15:04:24.056375 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-655tp" event={"ID":"75bb1f4d-328d-45b4-90f1-0e6bb237beeb","Type":"ContainerStarted","Data":"4eff3e9c204686f07c0b8b59849c32e3cb3fe4abdceae5e212c4b38921bd9e4c"} Feb 17 15:04:24 crc kubenswrapper[4717]: I0217 15:04:24.056394 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" event={"ID":"75bb1f4d-328d-45b4-90f1-0e6bb237beeb","Type":"ContainerStarted","Data":"5e58115e9874b3724fe89d90ab4952ee1f3c6c83515f6067d2b50e36cefcc9a9"} Feb 17 15:04:24 crc kubenswrapper[4717]: I0217 15:04:24.056409 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" event={"ID":"75bb1f4d-328d-45b4-90f1-0e6bb237beeb","Type":"ContainerStarted","Data":"6eacaefcb6fc14752f00e93a8ba504447b2b8d0920e72d8622a975ea2778f5fc"} Feb 17 15:04:24 crc kubenswrapper[4717]: I0217 15:04:24.056424 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" event={"ID":"75bb1f4d-328d-45b4-90f1-0e6bb237beeb","Type":"ContainerStarted","Data":"ada47e36c3243e1965c73fe3c5ad5ee4b61ee64d35eecdbb6d9f3867b107ca70"} Feb 17 15:04:24 crc kubenswrapper[4717]: I0217 15:04:24.057951 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nfvrt_3daa865c-6e58-4512-9be1-5d3a490a2f7a/kube-multus/2.log" Feb 17 15:04:27 crc kubenswrapper[4717]: I0217 15:04:27.079667 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" event={"ID":"75bb1f4d-328d-45b4-90f1-0e6bb237beeb","Type":"ContainerStarted","Data":"2bcb3a5b5c3e20b58dc0b5809bb9af623c2a75a765e8f6ed44932ab250a14b6e"} Feb 17 15:04:29 crc kubenswrapper[4717]: I0217 15:04:29.095922 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" 
event={"ID":"75bb1f4d-328d-45b4-90f1-0e6bb237beeb","Type":"ContainerStarted","Data":"188cd6c68d4c29535cbc42db38484a2f7bca87dbf5eeb96217edbc19d8eb73ea"} Feb 17 15:04:29 crc kubenswrapper[4717]: I0217 15:04:29.097454 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" Feb 17 15:04:29 crc kubenswrapper[4717]: I0217 15:04:29.097472 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" Feb 17 15:04:29 crc kubenswrapper[4717]: I0217 15:04:29.097484 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" Feb 17 15:04:29 crc kubenswrapper[4717]: I0217 15:04:29.128313 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" podStartSLOduration=7.128296165 podStartE2EDuration="7.128296165s" podCreationTimestamp="2026-02-17 15:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:04:29.124858137 +0000 UTC m=+735.540698633" watchObservedRunningTime="2026-02-17 15:04:29.128296165 +0000 UTC m=+735.544136641" Feb 17 15:04:29 crc kubenswrapper[4717]: I0217 15:04:29.136651 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" Feb 17 15:04:29 crc kubenswrapper[4717]: I0217 15:04:29.137821 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" Feb 17 15:04:37 crc kubenswrapper[4717]: I0217 15:04:37.846973 4717 scope.go:117] "RemoveContainer" containerID="a852fe0b82a2d6ba3bce6e311bef1cdc6fdad339fd7922aa1007be30e0774e55" Feb 17 15:04:39 crc kubenswrapper[4717]: I0217 15:04:39.168823 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-nfvrt_3daa865c-6e58-4512-9be1-5d3a490a2f7a/kube-multus/2.log" Feb 17 15:04:39 crc kubenswrapper[4717]: I0217 15:04:39.169305 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nfvrt" event={"ID":"3daa865c-6e58-4512-9be1-5d3a490a2f7a","Type":"ContainerStarted","Data":"b36297857fd77992b1edb53f2f9428682ee217fad097f5b6b40aa63bd24e9be6"} Feb 17 15:04:52 crc kubenswrapper[4717]: I0217 15:04:52.758061 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-655tp" Feb 17 15:04:55 crc kubenswrapper[4717]: I0217 15:04:55.799809 4717 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.409500 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb"] Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.411811 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.414801 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.431596 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb"] Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.577307 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.577504 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77v68\" (UniqueName: \"kubernetes.io/projected/3a5405a7-d00d-43ec-b3c4-ac0e52626876-kube-api-access-77v68\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.577616 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:00 crc kubenswrapper[4717]: 
I0217 15:05:00.678616 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.678809 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.678906 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77v68\" (UniqueName: \"kubernetes.io/projected/3a5405a7-d00d-43ec-b3c4-ac0e52626876-kube-api-access-77v68\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.679357 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.679739 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.709622 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77v68\" (UniqueName: \"kubernetes.io/projected/3a5405a7-d00d-43ec-b3c4-ac0e52626876-kube-api-access-77v68\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.737634 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:00 crc kubenswrapper[4717]: I0217 15:05:00.993714 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb"] Feb 17 15:05:01 crc kubenswrapper[4717]: I0217 15:05:01.362708 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" event={"ID":"3a5405a7-d00d-43ec-b3c4-ac0e52626876","Type":"ContainerStarted","Data":"d1e6b2c633e60654dee59bf821dcacf9edbd5d2359e091b213a55d1f5d469d16"} Feb 17 15:05:01 crc kubenswrapper[4717]: I0217 15:05:01.362786 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" event={"ID":"3a5405a7-d00d-43ec-b3c4-ac0e52626876","Type":"ContainerStarted","Data":"587d1086664839f725cc114b99cd7a80f7cfa51e60efbd70470b9bd1300b12c9"} Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.146832 4717 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mvrcp"] Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.155405 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.160568 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvrcp"] Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.303685 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-catalog-content\") pod \"redhat-operators-mvrcp\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.303877 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-utilities\") pod \"redhat-operators-mvrcp\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.303922 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpt8r\" (UniqueName: \"kubernetes.io/projected/07b4e09d-d9b4-4c76-83fa-d78113e1599e-kube-api-access-mpt8r\") pod \"redhat-operators-mvrcp\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.370406 4717 generic.go:334] "Generic (PLEG): container finished" podID="3a5405a7-d00d-43ec-b3c4-ac0e52626876" containerID="d1e6b2c633e60654dee59bf821dcacf9edbd5d2359e091b213a55d1f5d469d16" exitCode=0 Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 
15:05:02.370463 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" event={"ID":"3a5405a7-d00d-43ec-b3c4-ac0e52626876","Type":"ContainerDied","Data":"d1e6b2c633e60654dee59bf821dcacf9edbd5d2359e091b213a55d1f5d469d16"} Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.405189 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-utilities\") pod \"redhat-operators-mvrcp\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.405257 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpt8r\" (UniqueName: \"kubernetes.io/projected/07b4e09d-d9b4-4c76-83fa-d78113e1599e-kube-api-access-mpt8r\") pod \"redhat-operators-mvrcp\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.405290 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-catalog-content\") pod \"redhat-operators-mvrcp\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.405918 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-utilities\") pod \"redhat-operators-mvrcp\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.405980 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-catalog-content\") pod \"redhat-operators-mvrcp\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.443425 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpt8r\" (UniqueName: \"kubernetes.io/projected/07b4e09d-d9b4-4c76-83fa-d78113e1599e-kube-api-access-mpt8r\") pod \"redhat-operators-mvrcp\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.480821 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:02 crc kubenswrapper[4717]: I0217 15:05:02.908114 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvrcp"] Feb 17 15:05:02 crc kubenswrapper[4717]: W0217 15:05:02.915011 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b4e09d_d9b4_4c76_83fa_d78113e1599e.slice/crio-83fb04d96dd413f202e2b92024cf7f4860a01bed4be1da1047b41bf1bc5eb2b9 WatchSource:0}: Error finding container 83fb04d96dd413f202e2b92024cf7f4860a01bed4be1da1047b41bf1bc5eb2b9: Status 404 returned error can't find the container with id 83fb04d96dd413f202e2b92024cf7f4860a01bed4be1da1047b41bf1bc5eb2b9 Feb 17 15:05:03 crc kubenswrapper[4717]: I0217 15:05:03.377706 4717 generic.go:334] "Generic (PLEG): container finished" podID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerID="4203c6299ef23873ae3cc5e121bcd0af47a78328245bcac721d913a272b50ea1" exitCode=0 Feb 17 15:05:03 crc kubenswrapper[4717]: I0217 15:05:03.377765 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvrcp" 
event={"ID":"07b4e09d-d9b4-4c76-83fa-d78113e1599e","Type":"ContainerDied","Data":"4203c6299ef23873ae3cc5e121bcd0af47a78328245bcac721d913a272b50ea1"} Feb 17 15:05:03 crc kubenswrapper[4717]: I0217 15:05:03.377796 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvrcp" event={"ID":"07b4e09d-d9b4-4c76-83fa-d78113e1599e","Type":"ContainerStarted","Data":"83fb04d96dd413f202e2b92024cf7f4860a01bed4be1da1047b41bf1bc5eb2b9"} Feb 17 15:05:04 crc kubenswrapper[4717]: I0217 15:05:04.387412 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvrcp" event={"ID":"07b4e09d-d9b4-4c76-83fa-d78113e1599e","Type":"ContainerStarted","Data":"8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4"} Feb 17 15:05:05 crc kubenswrapper[4717]: I0217 15:05:05.401963 4717 generic.go:334] "Generic (PLEG): container finished" podID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerID="8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4" exitCode=0 Feb 17 15:05:05 crc kubenswrapper[4717]: I0217 15:05:05.402019 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvrcp" event={"ID":"07b4e09d-d9b4-4c76-83fa-d78113e1599e","Type":"ContainerDied","Data":"8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4"} Feb 17 15:05:06 crc kubenswrapper[4717]: I0217 15:05:06.410427 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvrcp" event={"ID":"07b4e09d-d9b4-4c76-83fa-d78113e1599e","Type":"ContainerStarted","Data":"3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0"} Feb 17 15:05:06 crc kubenswrapper[4717]: I0217 15:05:06.412704 4717 generic.go:334] "Generic (PLEG): container finished" podID="3a5405a7-d00d-43ec-b3c4-ac0e52626876" containerID="2dccd70be7875fc8926c5c29966092f5bf0fbb6c6465b917bd86fff99dcff5b6" exitCode=0 Feb 17 15:05:06 crc kubenswrapper[4717]: I0217 
15:05:06.412750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" event={"ID":"3a5405a7-d00d-43ec-b3c4-ac0e52626876","Type":"ContainerDied","Data":"2dccd70be7875fc8926c5c29966092f5bf0fbb6c6465b917bd86fff99dcff5b6"} Feb 17 15:05:06 crc kubenswrapper[4717]: I0217 15:05:06.439908 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mvrcp" podStartSLOduration=1.9434613490000001 podStartE2EDuration="4.439875136s" podCreationTimestamp="2026-02-17 15:05:02 +0000 UTC" firstStartedPulling="2026-02-17 15:05:03.380127696 +0000 UTC m=+769.795968182" lastFinishedPulling="2026-02-17 15:05:05.876541453 +0000 UTC m=+772.292381969" observedRunningTime="2026-02-17 15:05:06.438322162 +0000 UTC m=+772.854162638" watchObservedRunningTime="2026-02-17 15:05:06.439875136 +0000 UTC m=+772.855715622" Feb 17 15:05:07 crc kubenswrapper[4717]: I0217 15:05:07.423369 4717 generic.go:334] "Generic (PLEG): container finished" podID="3a5405a7-d00d-43ec-b3c4-ac0e52626876" containerID="557a2e17c2bd0bf59296bb23133ccbec0690f435e8fac88603cdc8eef20e02ce" exitCode=0 Feb 17 15:05:07 crc kubenswrapper[4717]: I0217 15:05:07.423480 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" event={"ID":"3a5405a7-d00d-43ec-b3c4-ac0e52626876","Type":"ContainerDied","Data":"557a2e17c2bd0bf59296bb23133ccbec0690f435e8fac88603cdc8eef20e02ce"} Feb 17 15:05:08 crc kubenswrapper[4717]: I0217 15:05:08.704065 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:08 crc kubenswrapper[4717]: I0217 15:05:08.799869 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-util\") pod \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " Feb 17 15:05:08 crc kubenswrapper[4717]: I0217 15:05:08.799936 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77v68\" (UniqueName: \"kubernetes.io/projected/3a5405a7-d00d-43ec-b3c4-ac0e52626876-kube-api-access-77v68\") pod \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " Feb 17 15:05:08 crc kubenswrapper[4717]: I0217 15:05:08.799964 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-bundle\") pod \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\" (UID: \"3a5405a7-d00d-43ec-b3c4-ac0e52626876\") " Feb 17 15:05:08 crc kubenswrapper[4717]: I0217 15:05:08.801021 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-bundle" (OuterVolumeSpecName: "bundle") pod "3a5405a7-d00d-43ec-b3c4-ac0e52626876" (UID: "3a5405a7-d00d-43ec-b3c4-ac0e52626876"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:05:08 crc kubenswrapper[4717]: I0217 15:05:08.811677 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-util" (OuterVolumeSpecName: "util") pod "3a5405a7-d00d-43ec-b3c4-ac0e52626876" (UID: "3a5405a7-d00d-43ec-b3c4-ac0e52626876"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:05:08 crc kubenswrapper[4717]: I0217 15:05:08.812300 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a5405a7-d00d-43ec-b3c4-ac0e52626876-kube-api-access-77v68" (OuterVolumeSpecName: "kube-api-access-77v68") pod "3a5405a7-d00d-43ec-b3c4-ac0e52626876" (UID: "3a5405a7-d00d-43ec-b3c4-ac0e52626876"). InnerVolumeSpecName "kube-api-access-77v68". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:05:08 crc kubenswrapper[4717]: I0217 15:05:08.901658 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-util\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:08 crc kubenswrapper[4717]: I0217 15:05:08.901724 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77v68\" (UniqueName: \"kubernetes.io/projected/3a5405a7-d00d-43ec-b3c4-ac0e52626876-kube-api-access-77v68\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:08 crc kubenswrapper[4717]: I0217 15:05:08.901778 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a5405a7-d00d-43ec-b3c4-ac0e52626876-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:09 crc kubenswrapper[4717]: I0217 15:05:09.439769 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" event={"ID":"3a5405a7-d00d-43ec-b3c4-ac0e52626876","Type":"ContainerDied","Data":"587d1086664839f725cc114b99cd7a80f7cfa51e60efbd70470b9bd1300b12c9"} Feb 17 15:05:09 crc kubenswrapper[4717]: I0217 15:05:09.440461 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="587d1086664839f725cc114b99cd7a80f7cfa51e60efbd70470b9bd1300b12c9" Feb 17 15:05:09 crc kubenswrapper[4717]: I0217 15:05:09.439843 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.058046 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-xgp5x"] Feb 17 15:05:11 crc kubenswrapper[4717]: E0217 15:05:11.058323 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5405a7-d00d-43ec-b3c4-ac0e52626876" containerName="pull" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.058338 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5405a7-d00d-43ec-b3c4-ac0e52626876" containerName="pull" Feb 17 15:05:11 crc kubenswrapper[4717]: E0217 15:05:11.058359 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5405a7-d00d-43ec-b3c4-ac0e52626876" containerName="extract" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.058368 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5405a7-d00d-43ec-b3c4-ac0e52626876" containerName="extract" Feb 17 15:05:11 crc kubenswrapper[4717]: E0217 15:05:11.058384 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5405a7-d00d-43ec-b3c4-ac0e52626876" containerName="util" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.058393 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5405a7-d00d-43ec-b3c4-ac0e52626876" containerName="util" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.058543 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5405a7-d00d-43ec-b3c4-ac0e52626876" containerName="extract" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.059033 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-xgp5x" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.061902 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-kd7cm" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.062037 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.062141 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.082401 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-xgp5x"] Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.234535 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5bd5\" (UniqueName: \"kubernetes.io/projected/05310076-5092-4692-a1a2-69826306ea88-kube-api-access-m5bd5\") pod \"nmstate-operator-694c9596b7-xgp5x\" (UID: \"05310076-5092-4692-a1a2-69826306ea88\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-xgp5x" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.335567 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5bd5\" (UniqueName: \"kubernetes.io/projected/05310076-5092-4692-a1a2-69826306ea88-kube-api-access-m5bd5\") pod \"nmstate-operator-694c9596b7-xgp5x\" (UID: \"05310076-5092-4692-a1a2-69826306ea88\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-xgp5x" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.358939 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5bd5\" (UniqueName: \"kubernetes.io/projected/05310076-5092-4692-a1a2-69826306ea88-kube-api-access-m5bd5\") pod \"nmstate-operator-694c9596b7-xgp5x\" (UID: 
\"05310076-5092-4692-a1a2-69826306ea88\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-xgp5x" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.374801 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-xgp5x" Feb 17 15:05:11 crc kubenswrapper[4717]: I0217 15:05:11.591967 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-xgp5x"] Feb 17 15:05:12 crc kubenswrapper[4717]: I0217 15:05:12.462312 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-xgp5x" event={"ID":"05310076-5092-4692-a1a2-69826306ea88","Type":"ContainerStarted","Data":"42a71fd7d9c9c5e7de21c2d4a770418847edc2aa8441bc8991638cc54a036407"} Feb 17 15:05:12 crc kubenswrapper[4717]: I0217 15:05:12.481037 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:12 crc kubenswrapper[4717]: I0217 15:05:12.481162 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:13 crc kubenswrapper[4717]: I0217 15:05:13.548597 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mvrcp" podUID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerName="registry-server" probeResult="failure" output=< Feb 17 15:05:13 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 17 15:05:13 crc kubenswrapper[4717]: > Feb 17 15:05:14 crc kubenswrapper[4717]: I0217 15:05:14.478483 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-xgp5x" event={"ID":"05310076-5092-4692-a1a2-69826306ea88","Type":"ContainerStarted","Data":"9e25db5828c8612b1855dd9d6362be0c2db273eba33ff961918292a7944a5844"} Feb 17 15:05:14 crc kubenswrapper[4717]: I0217 15:05:14.511548 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-xgp5x" podStartSLOduration=1.4598810150000001 podStartE2EDuration="3.511507605s" podCreationTimestamp="2026-02-17 15:05:11 +0000 UTC" firstStartedPulling="2026-02-17 15:05:11.600009293 +0000 UTC m=+778.015849789" lastFinishedPulling="2026-02-17 15:05:13.651635903 +0000 UTC m=+780.067476379" observedRunningTime="2026-02-17 15:05:14.503310762 +0000 UTC m=+780.919151298" watchObservedRunningTime="2026-02-17 15:05:14.511507605 +0000 UTC m=+780.927348121" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.189448 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-r6698"] Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.191249 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-r6698" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.202841 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-r6698"] Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.207467 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7"] Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.208441 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.212537 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.220191 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-4sl6h" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.234509 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7"] Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.238933 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-kcx4j"] Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.239880 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.346227 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn"] Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.347186 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.349935 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.349934 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.350196 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-s9vbt" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.359741 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn"] Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.374968 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2frlj\" (UniqueName: \"kubernetes.io/projected/28ecae2c-05f9-43e9-ad1a-581b4d6e8fea-kube-api-access-2frlj\") pod \"nmstate-metrics-58c85c668d-r6698\" (UID: \"28ecae2c-05f9-43e9-ad1a-581b4d6e8fea\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-r6698" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.375035 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-ovs-socket\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.375058 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvgkm\" (UniqueName: \"kubernetes.io/projected/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-kube-api-access-jvgkm\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " 
pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.375186 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv8vl\" (UniqueName: \"kubernetes.io/projected/005457ad-adeb-49ab-aaf7-6043dc2b6021-kube-api-access-kv8vl\") pod \"nmstate-webhook-866bcb46dc-pxrw7\" (UID: \"005457ad-adeb-49ab-aaf7-6043dc2b6021\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.375830 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-dbus-socket\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.375914 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/005457ad-adeb-49ab-aaf7-6043dc2b6021-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-pxrw7\" (UID: \"005457ad-adeb-49ab-aaf7-6043dc2b6021\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.376105 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-nmstate-lock\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477105 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/005457ad-adeb-49ab-aaf7-6043dc2b6021-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-pxrw7\" (UID: 
\"005457ad-adeb-49ab-aaf7-6043dc2b6021\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477183 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/303af50d-ffd5-4a3b-9020-b96bdcddc135-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-7rlqn\" (UID: \"303af50d-ffd5-4a3b-9020-b96bdcddc135\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477214 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-nmstate-lock\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477233 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/303af50d-ffd5-4a3b-9020-b96bdcddc135-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-7rlqn\" (UID: \"303af50d-ffd5-4a3b-9020-b96bdcddc135\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477259 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2frlj\" (UniqueName: \"kubernetes.io/projected/28ecae2c-05f9-43e9-ad1a-581b4d6e8fea-kube-api-access-2frlj\") pod \"nmstate-metrics-58c85c668d-r6698\" (UID: \"28ecae2c-05f9-43e9-ad1a-581b4d6e8fea\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-r6698" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477295 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp48d\" (UniqueName: 
\"kubernetes.io/projected/303af50d-ffd5-4a3b-9020-b96bdcddc135-kube-api-access-pp48d\") pod \"nmstate-console-plugin-5c78fc5d65-7rlqn\" (UID: \"303af50d-ffd5-4a3b-9020-b96bdcddc135\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" Feb 17 15:05:20 crc kubenswrapper[4717]: E0217 15:05:20.477291 4717 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477322 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-ovs-socket\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvgkm\" (UniqueName: \"kubernetes.io/projected/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-kube-api-access-jvgkm\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477362 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv8vl\" (UniqueName: \"kubernetes.io/projected/005457ad-adeb-49ab-aaf7-6043dc2b6021-kube-api-access-kv8vl\") pod \"nmstate-webhook-866bcb46dc-pxrw7\" (UID: \"005457ad-adeb-49ab-aaf7-6043dc2b6021\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" Feb 17 15:05:20 crc kubenswrapper[4717]: E0217 15:05:20.477386 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005457ad-adeb-49ab-aaf7-6043dc2b6021-tls-key-pair podName:005457ad-adeb-49ab-aaf7-6043dc2b6021 nodeName:}" failed. 
No retries permitted until 2026-02-17 15:05:20.977360635 +0000 UTC m=+787.393201341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/005457ad-adeb-49ab-aaf7-6043dc2b6021-tls-key-pair") pod "nmstate-webhook-866bcb46dc-pxrw7" (UID: "005457ad-adeb-49ab-aaf7-6043dc2b6021") : secret "openshift-nmstate-webhook" not found Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477417 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-dbus-socket\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477691 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-dbus-socket\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477751 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-nmstate-lock\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.477779 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-ovs-socket\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.503498 4717 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-2frlj\" (UniqueName: \"kubernetes.io/projected/28ecae2c-05f9-43e9-ad1a-581b4d6e8fea-kube-api-access-2frlj\") pod \"nmstate-metrics-58c85c668d-r6698\" (UID: \"28ecae2c-05f9-43e9-ad1a-581b4d6e8fea\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-r6698" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.503657 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvgkm\" (UniqueName: \"kubernetes.io/projected/6c630162-18cc-4f15-80f4-2b1ad7a2e87c-kube-api-access-jvgkm\") pod \"nmstate-handler-kcx4j\" (UID: \"6c630162-18cc-4f15-80f4-2b1ad7a2e87c\") " pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.503931 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv8vl\" (UniqueName: \"kubernetes.io/projected/005457ad-adeb-49ab-aaf7-6043dc2b6021-kube-api-access-kv8vl\") pod \"nmstate-webhook-866bcb46dc-pxrw7\" (UID: \"005457ad-adeb-49ab-aaf7-6043dc2b6021\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.507172 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-r6698" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.562254 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-9558bd49f-x65pc"] Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.565225 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.565904 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.577305 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9558bd49f-x65pc"] Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.578143 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn2gj\" (UniqueName: \"kubernetes.io/projected/c8723d88-2513-40be-b491-dd41b0862f33-kube-api-access-gn2gj\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.578249 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-service-ca\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.578318 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-oauth-serving-cert\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.578396 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp48d\" (UniqueName: \"kubernetes.io/projected/303af50d-ffd5-4a3b-9020-b96bdcddc135-kube-api-access-pp48d\") pod \"nmstate-console-plugin-5c78fc5d65-7rlqn\" (UID: \"303af50d-ffd5-4a3b-9020-b96bdcddc135\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.578507 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-console-config\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.578569 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-trusted-ca-bundle\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.578642 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8723d88-2513-40be-b491-dd41b0862f33-console-serving-cert\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.578706 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8723d88-2513-40be-b491-dd41b0862f33-console-oauth-config\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.578800 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/303af50d-ffd5-4a3b-9020-b96bdcddc135-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-7rlqn\" (UID: \"303af50d-ffd5-4a3b-9020-b96bdcddc135\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" Feb 17 15:05:20 crc 
kubenswrapper[4717]: I0217 15:05:20.578881 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/303af50d-ffd5-4a3b-9020-b96bdcddc135-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-7rlqn\" (UID: \"303af50d-ffd5-4a3b-9020-b96bdcddc135\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.582171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/303af50d-ffd5-4a3b-9020-b96bdcddc135-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-7rlqn\" (UID: \"303af50d-ffd5-4a3b-9020-b96bdcddc135\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.582895 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/303af50d-ffd5-4a3b-9020-b96bdcddc135-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-7rlqn\" (UID: \"303af50d-ffd5-4a3b-9020-b96bdcddc135\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.603943 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp48d\" (UniqueName: \"kubernetes.io/projected/303af50d-ffd5-4a3b-9020-b96bdcddc135-kube-api-access-pp48d\") pod \"nmstate-console-plugin-5c78fc5d65-7rlqn\" (UID: \"303af50d-ffd5-4a3b-9020-b96bdcddc135\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.664418 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.679669 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn2gj\" (UniqueName: \"kubernetes.io/projected/c8723d88-2513-40be-b491-dd41b0862f33-kube-api-access-gn2gj\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.679730 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-service-ca\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.679755 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-oauth-serving-cert\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.679836 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-console-config\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.679861 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-trusted-ca-bundle\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " 
pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.679891 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8723d88-2513-40be-b491-dd41b0862f33-console-serving-cert\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.679916 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8723d88-2513-40be-b491-dd41b0862f33-console-oauth-config\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.681746 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-oauth-serving-cert\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.685486 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-trusted-ca-bundle\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.686694 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-console-config\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 
15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.686689 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8723d88-2513-40be-b491-dd41b0862f33-service-ca\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.688676 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8723d88-2513-40be-b491-dd41b0862f33-console-serving-cert\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.695474 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8723d88-2513-40be-b491-dd41b0862f33-console-oauth-config\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.700812 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn2gj\" (UniqueName: \"kubernetes.io/projected/c8723d88-2513-40be-b491-dd41b0862f33-kube-api-access-gn2gj\") pod \"console-9558bd49f-x65pc\" (UID: \"c8723d88-2513-40be-b491-dd41b0862f33\") " pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.783595 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-r6698"] Feb 17 15:05:20 crc kubenswrapper[4717]: W0217 15:05:20.787515 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28ecae2c_05f9_43e9_ad1a_581b4d6e8fea.slice/crio-d88f49cc770f66f974c9e8959481d09b9f292a47a44ad1587641a0e54faa6bf8 WatchSource:0}: Error finding container d88f49cc770f66f974c9e8959481d09b9f292a47a44ad1587641a0e54faa6bf8: Status 404 returned error can't find the container with id d88f49cc770f66f974c9e8959481d09b9f292a47a44ad1587641a0e54faa6bf8 Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.809257 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.809315 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.857264 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn"] Feb 17 15:05:20 crc kubenswrapper[4717]: W0217 15:05:20.860489 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303af50d_ffd5_4a3b_9020_b96bdcddc135.slice/crio-0244b633b3ed30476978749ca0596e2edb6a53bd4933e85c303a09c76582e781 WatchSource:0}: Error finding container 0244b633b3ed30476978749ca0596e2edb6a53bd4933e85c303a09c76582e781: Status 404 returned error can't find the container with id 0244b633b3ed30476978749ca0596e2edb6a53bd4933e85c303a09c76582e781 Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.967689 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.986368 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/005457ad-adeb-49ab-aaf7-6043dc2b6021-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-pxrw7\" (UID: \"005457ad-adeb-49ab-aaf7-6043dc2b6021\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" Feb 17 15:05:20 crc kubenswrapper[4717]: I0217 15:05:20.991823 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/005457ad-adeb-49ab-aaf7-6043dc2b6021-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-pxrw7\" (UID: \"005457ad-adeb-49ab-aaf7-6043dc2b6021\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" Feb 17 15:05:21 crc kubenswrapper[4717]: I0217 15:05:21.124085 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" Feb 17 15:05:21 crc kubenswrapper[4717]: I0217 15:05:21.191199 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9558bd49f-x65pc"] Feb 17 15:05:21 crc kubenswrapper[4717]: I0217 15:05:21.327338 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7"] Feb 17 15:05:21 crc kubenswrapper[4717]: I0217 15:05:21.529382 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kcx4j" event={"ID":"6c630162-18cc-4f15-80f4-2b1ad7a2e87c","Type":"ContainerStarted","Data":"5a73e2b069bd379fd558d95db510e5808594fa84239fea5aff2ad0bb5ebe397f"} Feb 17 15:05:21 crc kubenswrapper[4717]: I0217 15:05:21.530999 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9558bd49f-x65pc" 
event={"ID":"c8723d88-2513-40be-b491-dd41b0862f33","Type":"ContainerStarted","Data":"0ae737483e912838ff89c561cf62e6fc7b1f9d74f66d0e9c5bdff9c772c79947"} Feb 17 15:05:21 crc kubenswrapper[4717]: I0217 15:05:21.531056 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9558bd49f-x65pc" event={"ID":"c8723d88-2513-40be-b491-dd41b0862f33","Type":"ContainerStarted","Data":"2aa1bbbc003265f9fd8edceac56d09c023fb9118392555d9d39323314599b585"} Feb 17 15:05:21 crc kubenswrapper[4717]: I0217 15:05:21.532800 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-r6698" event={"ID":"28ecae2c-05f9-43e9-ad1a-581b4d6e8fea","Type":"ContainerStarted","Data":"d88f49cc770f66f974c9e8959481d09b9f292a47a44ad1587641a0e54faa6bf8"} Feb 17 15:05:21 crc kubenswrapper[4717]: I0217 15:05:21.534962 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" event={"ID":"303af50d-ffd5-4a3b-9020-b96bdcddc135","Type":"ContainerStarted","Data":"0244b633b3ed30476978749ca0596e2edb6a53bd4933e85c303a09c76582e781"} Feb 17 15:05:21 crc kubenswrapper[4717]: I0217 15:05:21.536443 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" event={"ID":"005457ad-adeb-49ab-aaf7-6043dc2b6021","Type":"ContainerStarted","Data":"9e4e0fa08428cdd5d95ad3d2ce92e0ad65642d9e4919cc159e826b159db08eee"} Feb 17 15:05:21 crc kubenswrapper[4717]: I0217 15:05:21.557212 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9558bd49f-x65pc" podStartSLOduration=1.557178774 podStartE2EDuration="1.557178774s" podCreationTimestamp="2026-02-17 15:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:05:21.552937074 +0000 UTC m=+787.968777590" watchObservedRunningTime="2026-02-17 
15:05:21.557178774 +0000 UTC m=+787.973019290" Feb 17 15:05:22 crc kubenswrapper[4717]: I0217 15:05:22.578123 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:22 crc kubenswrapper[4717]: I0217 15:05:22.620450 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:22 crc kubenswrapper[4717]: I0217 15:05:22.811983 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvrcp"] Feb 17 15:05:23 crc kubenswrapper[4717]: I0217 15:05:23.552470 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-r6698" event={"ID":"28ecae2c-05f9-43e9-ad1a-581b4d6e8fea","Type":"ContainerStarted","Data":"66143692b36828e753c937480b2c86334a56a2e9154f631104d9497db4ab5977"} Feb 17 15:05:23 crc kubenswrapper[4717]: I0217 15:05:23.555597 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" event={"ID":"303af50d-ffd5-4a3b-9020-b96bdcddc135","Type":"ContainerStarted","Data":"85cd4a4d6df65d864a4a8dad1c0769fd57e12fe53618a6034bafa94706806279"} Feb 17 15:05:23 crc kubenswrapper[4717]: I0217 15:05:23.579478 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-7rlqn" podStartSLOduration=1.134385116 podStartE2EDuration="3.57945819s" podCreationTimestamp="2026-02-17 15:05:20 +0000 UTC" firstStartedPulling="2026-02-17 15:05:20.862874504 +0000 UTC m=+787.278714980" lastFinishedPulling="2026-02-17 15:05:23.307947578 +0000 UTC m=+789.723788054" observedRunningTime="2026-02-17 15:05:23.574256422 +0000 UTC m=+789.990096908" watchObservedRunningTime="2026-02-17 15:05:23.57945819 +0000 UTC m=+789.995298666" Feb 17 15:05:24 crc kubenswrapper[4717]: I0217 15:05:24.560829 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" event={"ID":"005457ad-adeb-49ab-aaf7-6043dc2b6021","Type":"ContainerStarted","Data":"50eeba5892aebb41ededed6d6a1597edc8c5a2b32fc7e031f55fec39535e5754"} Feb 17 15:05:24 crc kubenswrapper[4717]: I0217 15:05:24.561452 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" Feb 17 15:05:24 crc kubenswrapper[4717]: I0217 15:05:24.562944 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kcx4j" event={"ID":"6c630162-18cc-4f15-80f4-2b1ad7a2e87c","Type":"ContainerStarted","Data":"eac2d5063140d992e9c246dc5e9351833b3e43ad52eaf34bde681e460c339043"} Feb 17 15:05:24 crc kubenswrapper[4717]: I0217 15:05:24.563443 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mvrcp" podUID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerName="registry-server" containerID="cri-o://3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0" gracePeriod=2 Feb 17 15:05:24 crc kubenswrapper[4717]: I0217 15:05:24.580286 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" podStartSLOduration=2.287494137 podStartE2EDuration="4.580263095s" podCreationTimestamp="2026-02-17 15:05:20 +0000 UTC" firstStartedPulling="2026-02-17 15:05:21.332931135 +0000 UTC m=+787.748771611" lastFinishedPulling="2026-02-17 15:05:23.625700093 +0000 UTC m=+790.041540569" observedRunningTime="2026-02-17 15:05:24.576898599 +0000 UTC m=+790.992739095" watchObservedRunningTime="2026-02-17 15:05:24.580263095 +0000 UTC m=+790.996103571" Feb 17 15:05:24 crc kubenswrapper[4717]: I0217 15:05:24.601631 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-kcx4j" podStartSLOduration=1.8740299230000002 podStartE2EDuration="4.601609901s" 
podCreationTimestamp="2026-02-17 15:05:20 +0000 UTC" firstStartedPulling="2026-02-17 15:05:20.608457378 +0000 UTC m=+787.024297844" lastFinishedPulling="2026-02-17 15:05:23.336037346 +0000 UTC m=+789.751877822" observedRunningTime="2026-02-17 15:05:24.596804334 +0000 UTC m=+791.012644830" watchObservedRunningTime="2026-02-17 15:05:24.601609901 +0000 UTC m=+791.017450377" Feb 17 15:05:24 crc kubenswrapper[4717]: I0217 15:05:24.962077 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.144179 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-catalog-content\") pod \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.144612 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-utilities\") pod \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.144648 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpt8r\" (UniqueName: \"kubernetes.io/projected/07b4e09d-d9b4-4c76-83fa-d78113e1599e-kube-api-access-mpt8r\") pod \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\" (UID: \"07b4e09d-d9b4-4c76-83fa-d78113e1599e\") " Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.146193 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-utilities" (OuterVolumeSpecName: "utilities") pod "07b4e09d-d9b4-4c76-83fa-d78113e1599e" (UID: "07b4e09d-d9b4-4c76-83fa-d78113e1599e"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.150845 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07b4e09d-d9b4-4c76-83fa-d78113e1599e-kube-api-access-mpt8r" (OuterVolumeSpecName: "kube-api-access-mpt8r") pod "07b4e09d-d9b4-4c76-83fa-d78113e1599e" (UID: "07b4e09d-d9b4-4c76-83fa-d78113e1599e"). InnerVolumeSpecName "kube-api-access-mpt8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.246069 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.246136 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpt8r\" (UniqueName: \"kubernetes.io/projected/07b4e09d-d9b4-4c76-83fa-d78113e1599e-kube-api-access-mpt8r\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.265039 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07b4e09d-d9b4-4c76-83fa-d78113e1599e" (UID: "07b4e09d-d9b4-4c76-83fa-d78113e1599e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.347306 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b4e09d-d9b4-4c76-83fa-d78113e1599e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.567248 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.572869 4717 generic.go:334] "Generic (PLEG): container finished" podID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerID="3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0" exitCode=0 Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.572941 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvrcp" event={"ID":"07b4e09d-d9b4-4c76-83fa-d78113e1599e","Type":"ContainerDied","Data":"3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0"} Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.572997 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvrcp" event={"ID":"07b4e09d-d9b4-4c76-83fa-d78113e1599e","Type":"ContainerDied","Data":"83fb04d96dd413f202e2b92024cf7f4860a01bed4be1da1047b41bf1bc5eb2b9"} Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.572960 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mvrcp" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.573025 4717 scope.go:117] "RemoveContainer" containerID="3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.600966 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvrcp"] Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.606045 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mvrcp"] Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.813203 4717 scope.go:117] "RemoveContainer" containerID="8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.859557 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" path="/var/lib/kubelet/pods/07b4e09d-d9b4-4c76-83fa-d78113e1599e/volumes" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.882399 4717 scope.go:117] "RemoveContainer" containerID="4203c6299ef23873ae3cc5e121bcd0af47a78328245bcac721d913a272b50ea1" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.906751 4717 scope.go:117] "RemoveContainer" containerID="3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0" Feb 17 15:05:25 crc kubenswrapper[4717]: E0217 15:05:25.907556 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0\": container with ID starting with 3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0 not found: ID does not exist" containerID="3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.907600 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0"} err="failed to get container status \"3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0\": rpc error: code = NotFound desc = could not find container \"3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0\": container with ID starting with 3be7101a77d43aef6a0c2657a0f823ced0bb5c600730801dfc287099a9d0b8d0 not found: ID does not exist" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.907683 4717 scope.go:117] "RemoveContainer" containerID="8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4" Feb 17 15:05:25 crc kubenswrapper[4717]: E0217 15:05:25.908315 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4\": container with ID starting with 8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4 not found: ID does not exist" containerID="8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.908341 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4"} err="failed to get container status \"8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4\": rpc error: code = NotFound desc = could not find container \"8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4\": container with ID starting with 8513caf3012b5eb252f0b0ad01042d8cabeae62b5df21734ca13ce7d4f123dc4 not found: ID does not exist" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.908359 4717 scope.go:117] "RemoveContainer" containerID="4203c6299ef23873ae3cc5e121bcd0af47a78328245bcac721d913a272b50ea1" Feb 17 15:05:25 crc kubenswrapper[4717]: E0217 15:05:25.908770 4717 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4203c6299ef23873ae3cc5e121bcd0af47a78328245bcac721d913a272b50ea1\": container with ID starting with 4203c6299ef23873ae3cc5e121bcd0af47a78328245bcac721d913a272b50ea1 not found: ID does not exist" containerID="4203c6299ef23873ae3cc5e121bcd0af47a78328245bcac721d913a272b50ea1" Feb 17 15:05:25 crc kubenswrapper[4717]: I0217 15:05:25.908809 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4203c6299ef23873ae3cc5e121bcd0af47a78328245bcac721d913a272b50ea1"} err="failed to get container status \"4203c6299ef23873ae3cc5e121bcd0af47a78328245bcac721d913a272b50ea1\": rpc error: code = NotFound desc = could not find container \"4203c6299ef23873ae3cc5e121bcd0af47a78328245bcac721d913a272b50ea1\": container with ID starting with 4203c6299ef23873ae3cc5e121bcd0af47a78328245bcac721d913a272b50ea1 not found: ID does not exist" Feb 17 15:05:26 crc kubenswrapper[4717]: I0217 15:05:26.581416 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-r6698" event={"ID":"28ecae2c-05f9-43e9-ad1a-581b4d6e8fea","Type":"ContainerStarted","Data":"b909c6451a861baef4f10afd8e5c42f1b06f34c126ad12c64612311cbc5a1c88"} Feb 17 15:05:26 crc kubenswrapper[4717]: I0217 15:05:26.609574 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-r6698" podStartSLOduration=1.489894272 podStartE2EDuration="6.609549639s" podCreationTimestamp="2026-02-17 15:05:20 +0000 UTC" firstStartedPulling="2026-02-17 15:05:20.789173621 +0000 UTC m=+787.205014097" lastFinishedPulling="2026-02-17 15:05:25.908828988 +0000 UTC m=+792.324669464" observedRunningTime="2026-02-17 15:05:26.607280965 +0000 UTC m=+793.023121491" watchObservedRunningTime="2026-02-17 15:05:26.609549639 +0000 UTC m=+793.025390135" Feb 17 15:05:30 crc kubenswrapper[4717]: I0217 15:05:30.593930 4717 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-kcx4j" Feb 17 15:05:30 crc kubenswrapper[4717]: I0217 15:05:30.968304 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:30 crc kubenswrapper[4717]: I0217 15:05:30.968378 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:30 crc kubenswrapper[4717]: I0217 15:05:30.973406 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:31 crc kubenswrapper[4717]: I0217 15:05:31.619621 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9558bd49f-x65pc" Feb 17 15:05:31 crc kubenswrapper[4717]: I0217 15:05:31.674928 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-fbqb2"] Feb 17 15:05:41 crc kubenswrapper[4717]: I0217 15:05:41.131316 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pxrw7" Feb 17 15:05:50 crc kubenswrapper[4717]: I0217 15:05:50.808480 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:05:50 crc kubenswrapper[4717]: I0217 15:05:50.809064 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.631487 
4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m"] Feb 17 15:05:53 crc kubenswrapper[4717]: E0217 15:05:53.632496 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerName="extract-content" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.632513 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerName="extract-content" Feb 17 15:05:53 crc kubenswrapper[4717]: E0217 15:05:53.632541 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerName="registry-server" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.632550 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerName="registry-server" Feb 17 15:05:53 crc kubenswrapper[4717]: E0217 15:05:53.632567 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerName="extract-utilities" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.632577 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerName="extract-utilities" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.632704 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="07b4e09d-d9b4-4c76-83fa-d78113e1599e" containerName="registry-server" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.633801 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.640988 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.644147 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m"] Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.644312 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.644647 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75cdv\" (UniqueName: \"kubernetes.io/projected/d2c7425a-c6a2-4c42-b731-a24715c81039-kube-api-access-75cdv\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.644689 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:53 crc kubenswrapper[4717]: 
I0217 15:05:53.746145 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75cdv\" (UniqueName: \"kubernetes.io/projected/d2c7425a-c6a2-4c42-b731-a24715c81039-kube-api-access-75cdv\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.746210 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.746264 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.746777 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.747138 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.768132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75cdv\" (UniqueName: \"kubernetes.io/projected/d2c7425a-c6a2-4c42-b731-a24715c81039-kube-api-access-75cdv\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:53 crc kubenswrapper[4717]: I0217 15:05:53.959046 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:54 crc kubenswrapper[4717]: I0217 15:05:54.218249 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m"] Feb 17 15:05:54 crc kubenswrapper[4717]: I0217 15:05:54.792575 4717 generic.go:334] "Generic (PLEG): container finished" podID="d2c7425a-c6a2-4c42-b731-a24715c81039" containerID="71b01b8fe5dd2d818cd346389a33423ce4f08f9fbef1039d3504a82427292584" exitCode=0 Feb 17 15:05:54 crc kubenswrapper[4717]: I0217 15:05:54.792764 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" event={"ID":"d2c7425a-c6a2-4c42-b731-a24715c81039","Type":"ContainerDied","Data":"71b01b8fe5dd2d818cd346389a33423ce4f08f9fbef1039d3504a82427292584"} Feb 17 15:05:54 crc kubenswrapper[4717]: I0217 15:05:54.793147 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" event={"ID":"d2c7425a-c6a2-4c42-b731-a24715c81039","Type":"ContainerStarted","Data":"ac7ff2c646ccbe97ca65f38b92fd93ae45cd8de5d9dfe810c1bfdfea9e7df561"} Feb 17 15:05:56 crc kubenswrapper[4717]: I0217 15:05:56.725387 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-fbqb2" podUID="d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" containerName="console" containerID="cri-o://d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50" gracePeriod=15 Feb 17 15:05:56 crc kubenswrapper[4717]: I0217 15:05:56.807583 4717 generic.go:334] "Generic (PLEG): container finished" podID="d2c7425a-c6a2-4c42-b731-a24715c81039" containerID="9316832d00ae409b850009fd5cb24b5a513a5d56bcd875fa488d269f1b89016f" exitCode=0 Feb 17 15:05:56 crc kubenswrapper[4717]: I0217 15:05:56.807663 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" event={"ID":"d2c7425a-c6a2-4c42-b731-a24715c81039","Type":"ContainerDied","Data":"9316832d00ae409b850009fd5cb24b5a513a5d56bcd875fa488d269f1b89016f"} Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.130176 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-fbqb2_d03cf3b1-f05b-4c42-8e59-fb05060b5cb4/console/0.log" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.130238 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.298994 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5tbr\" (UniqueName: \"kubernetes.io/projected/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-kube-api-access-x5tbr\") pod \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.299173 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-serving-cert\") pod \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.299355 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-oauth-config\") pod \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.299405 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-trusted-ca-bundle\") pod \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.299437 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-config\") pod \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.299466 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-service-ca\") pod \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.300255 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-oauth-serving-cert\") pod \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\" (UID: \"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4\") " Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.300142 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-service-ca" (OuterVolumeSpecName: "service-ca") pod "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" (UID: "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.300176 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" (UID: "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.300191 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-config" (OuterVolumeSpecName: "console-config") pod "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" (UID: "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.300720 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" (UID: "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.300993 4717 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.301010 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.301020 4717 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.301031 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.304992 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" (UID: "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.305382 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" (UID: "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.306021 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-kube-api-access-x5tbr" (OuterVolumeSpecName: "kube-api-access-x5tbr") pod "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" (UID: "d03cf3b1-f05b-4c42-8e59-fb05060b5cb4"). InnerVolumeSpecName "kube-api-access-x5tbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.402442 4717 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.402482 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5tbr\" (UniqueName: \"kubernetes.io/projected/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-kube-api-access-x5tbr\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.402497 4717 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.823557 4717 generic.go:334] "Generic (PLEG): container finished" podID="d2c7425a-c6a2-4c42-b731-a24715c81039" 
containerID="230c9a6ba3d1cb2627b2a9f05fbd98fed85775d66adc0d29b9448b93a038e411" exitCode=0 Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.823711 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" event={"ID":"d2c7425a-c6a2-4c42-b731-a24715c81039","Type":"ContainerDied","Data":"230c9a6ba3d1cb2627b2a9f05fbd98fed85775d66adc0d29b9448b93a038e411"} Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.827298 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-fbqb2_d03cf3b1-f05b-4c42-8e59-fb05060b5cb4/console/0.log" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.827402 4717 generic.go:334] "Generic (PLEG): container finished" podID="d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" containerID="d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50" exitCode=2 Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.827464 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fbqb2" event={"ID":"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4","Type":"ContainerDied","Data":"d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50"} Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.827480 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-fbqb2" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.827513 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fbqb2" event={"ID":"d03cf3b1-f05b-4c42-8e59-fb05060b5cb4","Type":"ContainerDied","Data":"78debf03a6a3d7126d4f3abfe5cfcc54f173f1b6a23642d0379409c9c1b4b8c3"} Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.827548 4717 scope.go:117] "RemoveContainer" containerID="d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.864510 4717 scope.go:117] "RemoveContainer" containerID="d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50" Feb 17 15:05:57 crc kubenswrapper[4717]: E0217 15:05:57.865404 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50\": container with ID starting with d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50 not found: ID does not exist" containerID="d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.865549 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50"} err="failed to get container status \"d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50\": rpc error: code = NotFound desc = could not find container \"d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50\": container with ID starting with d7c95c1757c91a4b67c3489f01bc598843291f7353988ebc207d219ce74d8d50 not found: ID does not exist" Feb 17 15:05:57 crc kubenswrapper[4717]: I0217 15:05:57.872338 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-fbqb2"] Feb 17 15:05:57 crc 
kubenswrapper[4717]: I0217 15:05:57.876752 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-fbqb2"] Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.057376 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.125624 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75cdv\" (UniqueName: \"kubernetes.io/projected/d2c7425a-c6a2-4c42-b731-a24715c81039-kube-api-access-75cdv\") pod \"d2c7425a-c6a2-4c42-b731-a24715c81039\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.125753 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-util\") pod \"d2c7425a-c6a2-4c42-b731-a24715c81039\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.125871 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-bundle\") pod \"d2c7425a-c6a2-4c42-b731-a24715c81039\" (UID: \"d2c7425a-c6a2-4c42-b731-a24715c81039\") " Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.127508 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-bundle" (OuterVolumeSpecName: "bundle") pod "d2c7425a-c6a2-4c42-b731-a24715c81039" (UID: "d2c7425a-c6a2-4c42-b731-a24715c81039"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.139072 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-util" (OuterVolumeSpecName: "util") pod "d2c7425a-c6a2-4c42-b731-a24715c81039" (UID: "d2c7425a-c6a2-4c42-b731-a24715c81039"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.143636 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c7425a-c6a2-4c42-b731-a24715c81039-kube-api-access-75cdv" (OuterVolumeSpecName: "kube-api-access-75cdv") pod "d2c7425a-c6a2-4c42-b731-a24715c81039" (UID: "d2c7425a-c6a2-4c42-b731-a24715c81039"). InnerVolumeSpecName "kube-api-access-75cdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.227409 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.227450 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75cdv\" (UniqueName: \"kubernetes.io/projected/d2c7425a-c6a2-4c42-b731-a24715c81039-kube-api-access-75cdv\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.227462 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2c7425a-c6a2-4c42-b731-a24715c81039-util\") on node \"crc\" DevicePath \"\"" Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.846544 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.854437 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" path="/var/lib/kubelet/pods/d03cf3b1-f05b-4c42-8e59-fb05060b5cb4/volumes" Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.855146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m" event={"ID":"d2c7425a-c6a2-4c42-b731-a24715c81039","Type":"ContainerDied","Data":"ac7ff2c646ccbe97ca65f38b92fd93ae45cd8de5d9dfe810c1bfdfea9e7df561"} Feb 17 15:05:59 crc kubenswrapper[4717]: I0217 15:05:59.855184 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac7ff2c646ccbe97ca65f38b92fd93ae45cd8de5d9dfe810c1bfdfea9e7df561" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.634632 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-774c978687-jj472"] Feb 17 15:06:08 crc kubenswrapper[4717]: E0217 15:06:08.635494 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7425a-c6a2-4c42-b731-a24715c81039" containerName="util" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.635515 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7425a-c6a2-4c42-b731-a24715c81039" containerName="util" Feb 17 15:06:08 crc kubenswrapper[4717]: E0217 15:06:08.635539 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7425a-c6a2-4c42-b731-a24715c81039" containerName="pull" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.635546 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7425a-c6a2-4c42-b731-a24715c81039" containerName="pull" Feb 17 15:06:08 crc kubenswrapper[4717]: E0217 15:06:08.635558 4717 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d2c7425a-c6a2-4c42-b731-a24715c81039" containerName="extract" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.635567 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7425a-c6a2-4c42-b731-a24715c81039" containerName="extract" Feb 17 15:06:08 crc kubenswrapper[4717]: E0217 15:06:08.635578 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" containerName="console" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.635586 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" containerName="console" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.635717 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d03cf3b1-f05b-4c42-8e59-fb05060b5cb4" containerName="console" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.635729 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7425a-c6a2-4c42-b731-a24715c81039" containerName="extract" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.636257 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.638444 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.638573 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.639024 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.639058 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.639819 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-xl774" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.671107 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-774c978687-jj472"] Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.767685 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/656da445-df5b-402b-8347-70aa45a92159-apiservice-cert\") pod \"metallb-operator-controller-manager-774c978687-jj472\" (UID: \"656da445-df5b-402b-8347-70aa45a92159\") " pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.767780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jskrv\" (UniqueName: \"kubernetes.io/projected/656da445-df5b-402b-8347-70aa45a92159-kube-api-access-jskrv\") pod 
\"metallb-operator-controller-manager-774c978687-jj472\" (UID: \"656da445-df5b-402b-8347-70aa45a92159\") " pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.767835 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/656da445-df5b-402b-8347-70aa45a92159-webhook-cert\") pod \"metallb-operator-controller-manager-774c978687-jj472\" (UID: \"656da445-df5b-402b-8347-70aa45a92159\") " pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.868778 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/656da445-df5b-402b-8347-70aa45a92159-apiservice-cert\") pod \"metallb-operator-controller-manager-774c978687-jj472\" (UID: \"656da445-df5b-402b-8347-70aa45a92159\") " pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.868865 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jskrv\" (UniqueName: \"kubernetes.io/projected/656da445-df5b-402b-8347-70aa45a92159-kube-api-access-jskrv\") pod \"metallb-operator-controller-manager-774c978687-jj472\" (UID: \"656da445-df5b-402b-8347-70aa45a92159\") " pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.868917 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/656da445-df5b-402b-8347-70aa45a92159-webhook-cert\") pod \"metallb-operator-controller-manager-774c978687-jj472\" (UID: \"656da445-df5b-402b-8347-70aa45a92159\") " pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:08 crc 
kubenswrapper[4717]: I0217 15:06:08.876960 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/656da445-df5b-402b-8347-70aa45a92159-webhook-cert\") pod \"metallb-operator-controller-manager-774c978687-jj472\" (UID: \"656da445-df5b-402b-8347-70aa45a92159\") " pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.877050 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/656da445-df5b-402b-8347-70aa45a92159-apiservice-cert\") pod \"metallb-operator-controller-manager-774c978687-jj472\" (UID: \"656da445-df5b-402b-8347-70aa45a92159\") " pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.911169 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jskrv\" (UniqueName: \"kubernetes.io/projected/656da445-df5b-402b-8347-70aa45a92159-kube-api-access-jskrv\") pod \"metallb-operator-controller-manager-774c978687-jj472\" (UID: \"656da445-df5b-402b-8347-70aa45a92159\") " pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:08 crc kubenswrapper[4717]: I0217 15:06:08.967713 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.082926 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn"] Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.083958 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.088498 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.088601 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-rxk25" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.090392 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.111682 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn"] Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.278238 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b9d38f5-133a-43ed-bfef-8e5a27fa200c-apiservice-cert\") pod \"metallb-operator-webhook-server-56fd4fb65-pddnn\" (UID: \"5b9d38f5-133a-43ed-bfef-8e5a27fa200c\") " pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.278651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b9d38f5-133a-43ed-bfef-8e5a27fa200c-webhook-cert\") pod \"metallb-operator-webhook-server-56fd4fb65-pddnn\" (UID: \"5b9d38f5-133a-43ed-bfef-8e5a27fa200c\") " pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.278693 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9c8c\" (UniqueName: 
\"kubernetes.io/projected/5b9d38f5-133a-43ed-bfef-8e5a27fa200c-kube-api-access-q9c8c\") pod \"metallb-operator-webhook-server-56fd4fb65-pddnn\" (UID: \"5b9d38f5-133a-43ed-bfef-8e5a27fa200c\") " pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.380332 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9c8c\" (UniqueName: \"kubernetes.io/projected/5b9d38f5-133a-43ed-bfef-8e5a27fa200c-kube-api-access-q9c8c\") pod \"metallb-operator-webhook-server-56fd4fb65-pddnn\" (UID: \"5b9d38f5-133a-43ed-bfef-8e5a27fa200c\") " pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.380434 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b9d38f5-133a-43ed-bfef-8e5a27fa200c-apiservice-cert\") pod \"metallb-operator-webhook-server-56fd4fb65-pddnn\" (UID: \"5b9d38f5-133a-43ed-bfef-8e5a27fa200c\") " pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.380494 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b9d38f5-133a-43ed-bfef-8e5a27fa200c-webhook-cert\") pod \"metallb-operator-webhook-server-56fd4fb65-pddnn\" (UID: \"5b9d38f5-133a-43ed-bfef-8e5a27fa200c\") " pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.386703 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b9d38f5-133a-43ed-bfef-8e5a27fa200c-apiservice-cert\") pod \"metallb-operator-webhook-server-56fd4fb65-pddnn\" (UID: \"5b9d38f5-133a-43ed-bfef-8e5a27fa200c\") " pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 
15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.386703 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b9d38f5-133a-43ed-bfef-8e5a27fa200c-webhook-cert\") pod \"metallb-operator-webhook-server-56fd4fb65-pddnn\" (UID: \"5b9d38f5-133a-43ed-bfef-8e5a27fa200c\") " pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.401816 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9c8c\" (UniqueName: \"kubernetes.io/projected/5b9d38f5-133a-43ed-bfef-8e5a27fa200c-kube-api-access-q9c8c\") pod \"metallb-operator-webhook-server-56fd4fb65-pddnn\" (UID: \"5b9d38f5-133a-43ed-bfef-8e5a27fa200c\") " pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.529771 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-774c978687-jj472"] Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.700707 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:09 crc kubenswrapper[4717]: I0217 15:06:09.902301 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" event={"ID":"656da445-df5b-402b-8347-70aa45a92159","Type":"ContainerStarted","Data":"9367dc024205823e1895e487dfe64d712454f9b6b941690c83568d5df69128e1"} Feb 17 15:06:10 crc kubenswrapper[4717]: I0217 15:06:10.029137 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn"] Feb 17 15:06:10 crc kubenswrapper[4717]: I0217 15:06:10.909928 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" event={"ID":"5b9d38f5-133a-43ed-bfef-8e5a27fa200c","Type":"ContainerStarted","Data":"81a555929f5ab4c3a0ca3ebcf04a2c56981f27aac1b343a112c4e21dcf677cd2"} Feb 17 15:06:13 crc kubenswrapper[4717]: I0217 15:06:13.932145 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" event={"ID":"656da445-df5b-402b-8347-70aa45a92159","Type":"ContainerStarted","Data":"2175255c62bd894caf8d3af39dd73859a2908c8802f830787ff2e4f0711ba9c5"} Feb 17 15:06:13 crc kubenswrapper[4717]: I0217 15:06:13.932507 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:15 crc kubenswrapper[4717]: I0217 15:06:15.888347 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" podStartSLOduration=4.306038554 podStartE2EDuration="7.888320397s" podCreationTimestamp="2026-02-17 15:06:08 +0000 UTC" firstStartedPulling="2026-02-17 15:06:09.538518501 +0000 UTC m=+835.954358977" lastFinishedPulling="2026-02-17 15:06:13.120800344 +0000 UTC 
m=+839.536640820" observedRunningTime="2026-02-17 15:06:13.955227113 +0000 UTC m=+840.371067599" watchObservedRunningTime="2026-02-17 15:06:15.888320397 +0000 UTC m=+842.304160873" Feb 17 15:06:15 crc kubenswrapper[4717]: I0217 15:06:15.947324 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" event={"ID":"5b9d38f5-133a-43ed-bfef-8e5a27fa200c","Type":"ContainerStarted","Data":"3ba0ff2374de4ecc70aa172ce37f0a64682d6216b4bc41ea71a1083ca146dd20"} Feb 17 15:06:15 crc kubenswrapper[4717]: I0217 15:06:15.948269 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:15 crc kubenswrapper[4717]: I0217 15:06:15.967259 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" podStartSLOduration=1.650140693 podStartE2EDuration="6.967221788s" podCreationTimestamp="2026-02-17 15:06:09 +0000 UTC" firstStartedPulling="2026-02-17 15:06:10.034118207 +0000 UTC m=+836.449958683" lastFinishedPulling="2026-02-17 15:06:15.351199272 +0000 UTC m=+841.767039778" observedRunningTime="2026-02-17 15:06:15.964738877 +0000 UTC m=+842.380579353" watchObservedRunningTime="2026-02-17 15:06:15.967221788 +0000 UTC m=+842.383062264" Feb 17 15:06:20 crc kubenswrapper[4717]: I0217 15:06:20.808190 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:06:20 crc kubenswrapper[4717]: I0217 15:06:20.809057 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:06:20 crc kubenswrapper[4717]: I0217 15:06:20.809139 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:06:20 crc kubenswrapper[4717]: I0217 15:06:20.809898 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6cb2d452429431b113e7eb8a0c1c4fb59dfbfa50aec6a5702cb9844fb01cb5c"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:06:20 crc kubenswrapper[4717]: I0217 15:06:20.809958 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://a6cb2d452429431b113e7eb8a0c1c4fb59dfbfa50aec6a5702cb9844fb01cb5c" gracePeriod=600 Feb 17 15:06:20 crc kubenswrapper[4717]: I0217 15:06:20.996606 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="a6cb2d452429431b113e7eb8a0c1c4fb59dfbfa50aec6a5702cb9844fb01cb5c" exitCode=0 Feb 17 15:06:20 crc kubenswrapper[4717]: I0217 15:06:20.996686 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"a6cb2d452429431b113e7eb8a0c1c4fb59dfbfa50aec6a5702cb9844fb01cb5c"} Feb 17 15:06:20 crc kubenswrapper[4717]: I0217 15:06:20.996729 4717 scope.go:117] "RemoveContainer" containerID="2be4d6f9ba1b351a5eaa6776c534f8fa0de4b5afbe4f304b46d3b99ccaa26b45" Feb 17 15:06:22 crc kubenswrapper[4717]: I0217 15:06:22.004827 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"894debb3d49a5afafc8d152c1e296cd8509036d91968011b7ffc16cede4826fe"} Feb 17 15:06:29 crc kubenswrapper[4717]: I0217 15:06:29.706549 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-56fd4fb65-pddnn" Feb 17 15:06:48 crc kubenswrapper[4717]: I0217 15:06:48.970276 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-774c978687-jj472" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.725687 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-njs4q"] Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.728188 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.729946 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-wpqw7" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.730504 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.734061 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.735590 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl"] Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.736835 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.741234 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.753602 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl"] Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.828318 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-7k7sc"] Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.829863 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.831762 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.837687 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vc7q2"] Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.839273 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-vc7q2" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841211 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acf9f85d-e0ff-419b-8b56-ceda8ffeb28a-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8f7bl\" (UID: \"acf9f85d-e0ff-419b-8b56-ceda8ffeb28a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841301 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-frr-startup\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841333 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l72tm\" (UniqueName: \"kubernetes.io/projected/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-kube-api-access-l72tm\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841425 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-reloader\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841472 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-frr-sockets\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 
17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841488 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841507 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg9c5\" (UniqueName: \"kubernetes.io/projected/acf9f85d-e0ff-419b-8b56-ceda8ffeb28a-kube-api-access-hg9c5\") pod \"frr-k8s-webhook-server-78b44bf5bb-8f7bl\" (UID: \"acf9f85d-e0ff-419b-8b56-ceda8ffeb28a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841546 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-metrics-certs\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841715 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-metrics\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841739 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-6tsfg" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841756 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-frr-conf\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.841856 4717 reflector.go:368] Caches 
populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.844524 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.854928 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-7k7sc"] Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943308 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-frr-startup\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943376 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72e0c267-99f7-4d36-83e8-219560a63667-cert\") pod \"controller-69bbfbf88f-7k7sc\" (UID: \"72e0c267-99f7-4d36-83e8-219560a63667\") " pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943411 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l72tm\" (UniqueName: \"kubernetes.io/projected/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-kube-api-access-l72tm\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943437 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-reloader\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943592 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-frr-sockets\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943660 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggckf\" (UniqueName: \"kubernetes.io/projected/72e0c267-99f7-4d36-83e8-219560a63667-kube-api-access-ggckf\") pod \"controller-69bbfbf88f-7k7sc\" (UID: \"72e0c267-99f7-4d36-83e8-219560a63667\") " pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943685 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zfw5\" (UniqueName: \"kubernetes.io/projected/0179ef66-9a6c-440d-8606-f2f040fe7b44-kube-api-access-2zfw5\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943748 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg9c5\" (UniqueName: \"kubernetes.io/projected/acf9f85d-e0ff-419b-8b56-ceda8ffeb28a-kube-api-access-hg9c5\") pod \"frr-k8s-webhook-server-78b44bf5bb-8f7bl\" (UID: \"acf9f85d-e0ff-419b-8b56-ceda8ffeb28a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943786 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-memberlist\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943822 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-reloader\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943870 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-metrics-certs\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943903 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-metrics\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943930 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72e0c267-99f7-4d36-83e8-219560a63667-metrics-certs\") pod \"controller-69bbfbf88f-7k7sc\" (UID: \"72e0c267-99f7-4d36-83e8-219560a63667\") " pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.943960 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-frr-conf\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.944003 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acf9f85d-e0ff-419b-8b56-ceda8ffeb28a-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8f7bl\" 
(UID: \"acf9f85d-e0ff-419b-8b56-ceda8ffeb28a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.944035 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0179ef66-9a6c-440d-8606-f2f040fe7b44-metallb-excludel2\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.944059 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-frr-sockets\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.944070 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-metrics-certs\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.944404 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-metrics\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.944688 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-frr-startup\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.944867 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-frr-conf\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.950757 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-metrics-certs\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.967267 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acf9f85d-e0ff-419b-8b56-ceda8ffeb28a-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8f7bl\" (UID: \"acf9f85d-e0ff-419b-8b56-ceda8ffeb28a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.967424 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l72tm\" (UniqueName: \"kubernetes.io/projected/fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321-kube-api-access-l72tm\") pod \"frr-k8s-njs4q\" (UID: \"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321\") " pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:49 crc kubenswrapper[4717]: I0217 15:06:49.974531 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg9c5\" (UniqueName: \"kubernetes.io/projected/acf9f85d-e0ff-419b-8b56-ceda8ffeb28a-kube-api-access-hg9c5\") pod \"frr-k8s-webhook-server-78b44bf5bb-8f7bl\" (UID: \"acf9f85d-e0ff-419b-8b56-ceda8ffeb28a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.045594 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/72e0c267-99f7-4d36-83e8-219560a63667-cert\") pod \"controller-69bbfbf88f-7k7sc\" (UID: \"72e0c267-99f7-4d36-83e8-219560a63667\") " pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.045706 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggckf\" (UniqueName: \"kubernetes.io/projected/72e0c267-99f7-4d36-83e8-219560a63667-kube-api-access-ggckf\") pod \"controller-69bbfbf88f-7k7sc\" (UID: \"72e0c267-99f7-4d36-83e8-219560a63667\") " pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.045742 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zfw5\" (UniqueName: \"kubernetes.io/projected/0179ef66-9a6c-440d-8606-f2f040fe7b44-kube-api-access-2zfw5\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.045798 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-memberlist\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.045855 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72e0c267-99f7-4d36-83e8-219560a63667-metrics-certs\") pod \"controller-69bbfbf88f-7k7sc\" (UID: \"72e0c267-99f7-4d36-83e8-219560a63667\") " pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.045915 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0179ef66-9a6c-440d-8606-f2f040fe7b44-metallb-excludel2\") pod 
\"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.045961 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-metrics-certs\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:50 crc kubenswrapper[4717]: E0217 15:06:50.046063 4717 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 15:06:50 crc kubenswrapper[4717]: E0217 15:06:50.046188 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-memberlist podName:0179ef66-9a6c-440d-8606-f2f040fe7b44 nodeName:}" failed. No retries permitted until 2026-02-17 15:06:50.546161624 +0000 UTC m=+876.962002320 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-memberlist") pod "speaker-vc7q2" (UID: "0179ef66-9a6c-440d-8606-f2f040fe7b44") : secret "metallb-memberlist" not found Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.046897 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0179ef66-9a6c-440d-8606-f2f040fe7b44-metallb-excludel2\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.048334 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.050489 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-njs4q" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.050641 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-metrics-certs\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.052693 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72e0c267-99f7-4d36-83e8-219560a63667-metrics-certs\") pod \"controller-69bbfbf88f-7k7sc\" (UID: \"72e0c267-99f7-4d36-83e8-219560a63667\") " pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.058551 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.061766 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72e0c267-99f7-4d36-83e8-219560a63667-cert\") pod \"controller-69bbfbf88f-7k7sc\" (UID: \"72e0c267-99f7-4d36-83e8-219560a63667\") " pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.063105 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggckf\" (UniqueName: \"kubernetes.io/projected/72e0c267-99f7-4d36-83e8-219560a63667-kube-api-access-ggckf\") pod \"controller-69bbfbf88f-7k7sc\" (UID: \"72e0c267-99f7-4d36-83e8-219560a63667\") " pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.066525 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zfw5\" (UniqueName: 
\"kubernetes.io/projected/0179ef66-9a6c-440d-8606-f2f040fe7b44-kube-api-access-2zfw5\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.149345 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.509665 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl"] Feb 17 15:06:50 crc kubenswrapper[4717]: W0217 15:06:50.517472 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacf9f85d_e0ff_419b_8b56_ceda8ffeb28a.slice/crio-f1e251a115dcc722b40154e98f4444facc8d7250c73b2133ea745236dc55937c WatchSource:0}: Error finding container f1e251a115dcc722b40154e98f4444facc8d7250c73b2133ea745236dc55937c: Status 404 returned error can't find the container with id f1e251a115dcc722b40154e98f4444facc8d7250c73b2133ea745236dc55937c Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.554665 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-memberlist\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:50 crc kubenswrapper[4717]: E0217 15:06:50.554871 4717 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 15:06:50 crc kubenswrapper[4717]: E0217 15:06:50.554968 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-memberlist podName:0179ef66-9a6c-440d-8606-f2f040fe7b44 nodeName:}" failed. 
No retries permitted until 2026-02-17 15:06:51.554942934 +0000 UTC m=+877.970783410 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-memberlist") pod "speaker-vc7q2" (UID: "0179ef66-9a6c-440d-8606-f2f040fe7b44") : secret "metallb-memberlist" not found Feb 17 15:06:50 crc kubenswrapper[4717]: I0217 15:06:50.599302 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-7k7sc"] Feb 17 15:06:50 crc kubenswrapper[4717]: W0217 15:06:50.603061 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72e0c267_99f7_4d36_83e8_219560a63667.slice/crio-19dee4f46b0e8d1d3d4bd68c58b49ea3533d19529077d9ee1c2fb6f1742698c9 WatchSource:0}: Error finding container 19dee4f46b0e8d1d3d4bd68c58b49ea3533d19529077d9ee1c2fb6f1742698c9: Status 404 returned error can't find the container with id 19dee4f46b0e8d1d3d4bd68c58b49ea3533d19529077d9ee1c2fb6f1742698c9 Feb 17 15:06:51 crc kubenswrapper[4717]: I0217 15:06:51.210039 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" event={"ID":"acf9f85d-e0ff-419b-8b56-ceda8ffeb28a","Type":"ContainerStarted","Data":"f1e251a115dcc722b40154e98f4444facc8d7250c73b2133ea745236dc55937c"} Feb 17 15:06:51 crc kubenswrapper[4717]: I0217 15:06:51.212764 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7k7sc" event={"ID":"72e0c267-99f7-4d36-83e8-219560a63667","Type":"ContainerStarted","Data":"724ab63fa6788d48e2ecf3af1f42d89988533d68bc4155fd470983afa309ff1e"} Feb 17 15:06:51 crc kubenswrapper[4717]: I0217 15:06:51.212794 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7k7sc" 
event={"ID":"72e0c267-99f7-4d36-83e8-219560a63667","Type":"ContainerStarted","Data":"1dd3af7953ba50d6f300fca11a48a3c9faf6e0b48d515fade37765ca25fe3d78"} Feb 17 15:06:51 crc kubenswrapper[4717]: I0217 15:06:51.212807 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7k7sc" event={"ID":"72e0c267-99f7-4d36-83e8-219560a63667","Type":"ContainerStarted","Data":"19dee4f46b0e8d1d3d4bd68c58b49ea3533d19529077d9ee1c2fb6f1742698c9"} Feb 17 15:06:51 crc kubenswrapper[4717]: I0217 15:06:51.213041 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:06:51 crc kubenswrapper[4717]: I0217 15:06:51.214025 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-njs4q" event={"ID":"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321","Type":"ContainerStarted","Data":"74734ee2db42759bceb96404b2859669755994dd928555cd03346940fb2646b2"} Feb 17 15:06:51 crc kubenswrapper[4717]: I0217 15:06:51.232965 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-7k7sc" podStartSLOduration=2.23294497 podStartE2EDuration="2.23294497s" podCreationTimestamp="2026-02-17 15:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:06:51.228564445 +0000 UTC m=+877.644404961" watchObservedRunningTime="2026-02-17 15:06:51.23294497 +0000 UTC m=+877.648785446" Feb 17 15:06:51 crc kubenswrapper[4717]: I0217 15:06:51.570270 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-memberlist\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:51 crc kubenswrapper[4717]: I0217 15:06:51.577688 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0179ef66-9a6c-440d-8606-f2f040fe7b44-memberlist\") pod \"speaker-vc7q2\" (UID: \"0179ef66-9a6c-440d-8606-f2f040fe7b44\") " pod="metallb-system/speaker-vc7q2" Feb 17 15:06:51 crc kubenswrapper[4717]: I0217 15:06:51.659892 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vc7q2" Feb 17 15:06:52 crc kubenswrapper[4717]: I0217 15:06:52.223585 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vc7q2" event={"ID":"0179ef66-9a6c-440d-8606-f2f040fe7b44","Type":"ContainerStarted","Data":"3219e142a7e68adf55e18be2b332dac84e6452c177ad3703aabe459e73012ff5"} Feb 17 15:06:52 crc kubenswrapper[4717]: I0217 15:06:52.223952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vc7q2" event={"ID":"0179ef66-9a6c-440d-8606-f2f040fe7b44","Type":"ContainerStarted","Data":"74964c58bd48928415046ae208c56c7e16f1406a546edfecaef55902f49e83e4"} Feb 17 15:06:53 crc kubenswrapper[4717]: I0217 15:06:53.234888 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vc7q2" event={"ID":"0179ef66-9a6c-440d-8606-f2f040fe7b44","Type":"ContainerStarted","Data":"ae880db93c9746b9e8894bcff62198a630e77a46ea763835bff97a683fa50fee"} Feb 17 15:06:53 crc kubenswrapper[4717]: I0217 15:06:53.235317 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vc7q2" Feb 17 15:06:53 crc kubenswrapper[4717]: I0217 15:06:53.266919 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vc7q2" podStartSLOduration=4.266897216 podStartE2EDuration="4.266897216s" podCreationTimestamp="2026-02-17 15:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:06:53.263349245 +0000 UTC m=+879.679189721" watchObservedRunningTime="2026-02-17 
15:06:53.266897216 +0000 UTC m=+879.682737702" Feb 17 15:06:58 crc kubenswrapper[4717]: I0217 15:06:58.275904 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" event={"ID":"acf9f85d-e0ff-419b-8b56-ceda8ffeb28a","Type":"ContainerStarted","Data":"7bd2c4b649d0a6308437cb3f82a10a0a27f5b937b261080b5147b16b2ecfc432"} Feb 17 15:06:58 crc kubenswrapper[4717]: I0217 15:06:58.276375 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" Feb 17 15:06:58 crc kubenswrapper[4717]: I0217 15:06:58.279123 4717 generic.go:334] "Generic (PLEG): container finished" podID="fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321" containerID="801df13cb5b68106a1d4d34ad0a8a33e38792c685b62a7d799ede6da7cde3d6b" exitCode=0 Feb 17 15:06:58 crc kubenswrapper[4717]: I0217 15:06:58.279180 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-njs4q" event={"ID":"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321","Type":"ContainerDied","Data":"801df13cb5b68106a1d4d34ad0a8a33e38792c685b62a7d799ede6da7cde3d6b"} Feb 17 15:06:58 crc kubenswrapper[4717]: I0217 15:06:58.306828 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" podStartSLOduration=2.410792993 podStartE2EDuration="9.306797071s" podCreationTimestamp="2026-02-17 15:06:49 +0000 UTC" firstStartedPulling="2026-02-17 15:06:50.519561585 +0000 UTC m=+876.935402061" lastFinishedPulling="2026-02-17 15:06:57.415565663 +0000 UTC m=+883.831406139" observedRunningTime="2026-02-17 15:06:58.298912766 +0000 UTC m=+884.714753282" watchObservedRunningTime="2026-02-17 15:06:58.306797071 +0000 UTC m=+884.722637587" Feb 17 15:06:59 crc kubenswrapper[4717]: I0217 15:06:59.286911 4717 generic.go:334] "Generic (PLEG): container finished" podID="fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321" 
containerID="ec3026cac86b0032468133e534f32f1ddc081c513d0988d46f23de4b6082777a" exitCode=0 Feb 17 15:06:59 crc kubenswrapper[4717]: I0217 15:06:59.287033 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-njs4q" event={"ID":"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321","Type":"ContainerDied","Data":"ec3026cac86b0032468133e534f32f1ddc081c513d0988d46f23de4b6082777a"} Feb 17 15:07:00 crc kubenswrapper[4717]: I0217 15:07:00.153891 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-7k7sc" Feb 17 15:07:00 crc kubenswrapper[4717]: I0217 15:07:00.297067 4717 generic.go:334] "Generic (PLEG): container finished" podID="fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321" containerID="27bd904d95d10e7e61c680114635416051c3292a310d043beff62c9bfb663d97" exitCode=0 Feb 17 15:07:00 crc kubenswrapper[4717]: I0217 15:07:00.297839 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-njs4q" event={"ID":"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321","Type":"ContainerDied","Data":"27bd904d95d10e7e61c680114635416051c3292a310d043beff62c9bfb663d97"} Feb 17 15:07:01 crc kubenswrapper[4717]: I0217 15:07:01.326260 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-njs4q" event={"ID":"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321","Type":"ContainerStarted","Data":"6f4fd76024bc6fc58e16c3123757837b5471ec2c0c24cc3cc7610b7de63747da"} Feb 17 15:07:01 crc kubenswrapper[4717]: I0217 15:07:01.326304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-njs4q" event={"ID":"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321","Type":"ContainerStarted","Data":"d452f011a31c9d8c8eca1881a13efd825ac842090560cf3c1344a97caadbc01e"} Feb 17 15:07:01 crc kubenswrapper[4717]: I0217 15:07:01.326313 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-njs4q" 
event={"ID":"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321","Type":"ContainerStarted","Data":"c1fb64a8edf660694b6f18120ddbee37d9e800e7c36d59d60fdebe98167fbb2d"} Feb 17 15:07:01 crc kubenswrapper[4717]: I0217 15:07:01.326322 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-njs4q" event={"ID":"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321","Type":"ContainerStarted","Data":"e6d2c0bb529e705740a2b478b8833660455bfeab6c2f2ebaf1366ec9ec261a40"} Feb 17 15:07:01 crc kubenswrapper[4717]: I0217 15:07:01.326332 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-njs4q" event={"ID":"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321","Type":"ContainerStarted","Data":"48b699bf4d64246e1fa4a4609700dce846224fbdb97f3b6823637c877130a970"} Feb 17 15:07:01 crc kubenswrapper[4717]: I0217 15:07:01.326341 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-njs4q" event={"ID":"fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321","Type":"ContainerStarted","Data":"fe36eed21029248bcd2961958b22d9866ddcc5fd07f6b4babd1f4a72303e9631"} Feb 17 15:07:02 crc kubenswrapper[4717]: I0217 15:07:02.334457 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-njs4q" Feb 17 15:07:05 crc kubenswrapper[4717]: I0217 15:07:05.050833 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-njs4q" Feb 17 15:07:05 crc kubenswrapper[4717]: I0217 15:07:05.096431 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-njs4q" Feb 17 15:07:05 crc kubenswrapper[4717]: I0217 15:07:05.127281 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-njs4q" podStartSLOduration=8.899151816 podStartE2EDuration="16.127264916s" podCreationTimestamp="2026-02-17 15:06:49 +0000 UTC" firstStartedPulling="2026-02-17 15:06:50.20275598 +0000 UTC m=+876.618596456" lastFinishedPulling="2026-02-17 15:06:57.43086908 
+0000 UTC m=+883.846709556" observedRunningTime="2026-02-17 15:07:02.369954429 +0000 UTC m=+888.785794905" watchObservedRunningTime="2026-02-17 15:07:05.127264916 +0000 UTC m=+891.543105392" Feb 17 15:07:10 crc kubenswrapper[4717]: I0217 15:07:10.054510 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-njs4q" Feb 17 15:07:10 crc kubenswrapper[4717]: I0217 15:07:10.061990 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8f7bl" Feb 17 15:07:11 crc kubenswrapper[4717]: I0217 15:07:11.663234 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vc7q2" Feb 17 15:07:14 crc kubenswrapper[4717]: I0217 15:07:14.278543 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fc5qj"] Feb 17 15:07:14 crc kubenswrapper[4717]: I0217 15:07:14.279791 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-fc5qj" Feb 17 15:07:14 crc kubenswrapper[4717]: I0217 15:07:14.282627 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-xtjk2" Feb 17 15:07:14 crc kubenswrapper[4717]: I0217 15:07:14.282979 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 17 15:07:14 crc kubenswrapper[4717]: I0217 15:07:14.296178 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 17 15:07:14 crc kubenswrapper[4717]: I0217 15:07:14.299930 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fc5qj"] Feb 17 15:07:14 crc kubenswrapper[4717]: I0217 15:07:14.429380 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vhj\" (UniqueName: \"kubernetes.io/projected/61b00c35-b6c8-493d-a8bd-d8f85fd3fb94-kube-api-access-w7vhj\") pod \"openstack-operator-index-fc5qj\" (UID: \"61b00c35-b6c8-493d-a8bd-d8f85fd3fb94\") " pod="openstack-operators/openstack-operator-index-fc5qj" Feb 17 15:07:14 crc kubenswrapper[4717]: I0217 15:07:14.531293 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7vhj\" (UniqueName: \"kubernetes.io/projected/61b00c35-b6c8-493d-a8bd-d8f85fd3fb94-kube-api-access-w7vhj\") pod \"openstack-operator-index-fc5qj\" (UID: \"61b00c35-b6c8-493d-a8bd-d8f85fd3fb94\") " pod="openstack-operators/openstack-operator-index-fc5qj" Feb 17 15:07:14 crc kubenswrapper[4717]: I0217 15:07:14.562960 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7vhj\" (UniqueName: \"kubernetes.io/projected/61b00c35-b6c8-493d-a8bd-d8f85fd3fb94-kube-api-access-w7vhj\") pod \"openstack-operator-index-fc5qj\" (UID: 
\"61b00c35-b6c8-493d-a8bd-d8f85fd3fb94\") " pod="openstack-operators/openstack-operator-index-fc5qj" Feb 17 15:07:14 crc kubenswrapper[4717]: I0217 15:07:14.607007 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fc5qj" Feb 17 15:07:15 crc kubenswrapper[4717]: I0217 15:07:15.009245 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fc5qj"] Feb 17 15:07:15 crc kubenswrapper[4717]: I0217 15:07:15.418646 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fc5qj" event={"ID":"61b00c35-b6c8-493d-a8bd-d8f85fd3fb94","Type":"ContainerStarted","Data":"4bb67ab6b6d7d22613541201a1c8934f2d8a17dc75965a3aae56f239f9db4aa7"} Feb 17 15:07:17 crc kubenswrapper[4717]: I0217 15:07:17.459044 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-fc5qj"] Feb 17 15:07:18 crc kubenswrapper[4717]: I0217 15:07:18.064458 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jpq9r"] Feb 17 15:07:18 crc kubenswrapper[4717]: I0217 15:07:18.065812 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jpq9r" Feb 17 15:07:18 crc kubenswrapper[4717]: I0217 15:07:18.082939 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jpq9r"] Feb 17 15:07:18 crc kubenswrapper[4717]: I0217 15:07:18.190463 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n4g9\" (UniqueName: \"kubernetes.io/projected/e69d57ba-48b1-48d9-b658-c0a86cf05ab4-kube-api-access-8n4g9\") pod \"openstack-operator-index-jpq9r\" (UID: \"e69d57ba-48b1-48d9-b658-c0a86cf05ab4\") " pod="openstack-operators/openstack-operator-index-jpq9r" Feb 17 15:07:18 crc kubenswrapper[4717]: I0217 15:07:18.291553 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n4g9\" (UniqueName: \"kubernetes.io/projected/e69d57ba-48b1-48d9-b658-c0a86cf05ab4-kube-api-access-8n4g9\") pod \"openstack-operator-index-jpq9r\" (UID: \"e69d57ba-48b1-48d9-b658-c0a86cf05ab4\") " pod="openstack-operators/openstack-operator-index-jpq9r" Feb 17 15:07:18 crc kubenswrapper[4717]: I0217 15:07:18.312666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n4g9\" (UniqueName: \"kubernetes.io/projected/e69d57ba-48b1-48d9-b658-c0a86cf05ab4-kube-api-access-8n4g9\") pod \"openstack-operator-index-jpq9r\" (UID: \"e69d57ba-48b1-48d9-b658-c0a86cf05ab4\") " pod="openstack-operators/openstack-operator-index-jpq9r" Feb 17 15:07:18 crc kubenswrapper[4717]: I0217 15:07:18.402340 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jpq9r" Feb 17 15:07:19 crc kubenswrapper[4717]: I0217 15:07:19.089663 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jpq9r"] Feb 17 15:07:19 crc kubenswrapper[4717]: W0217 15:07:19.098527 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode69d57ba_48b1_48d9_b658_c0a86cf05ab4.slice/crio-a7403230f42f8521dc270dde63bfe012dc058b8d07993283e5feae6e670178e3 WatchSource:0}: Error finding container a7403230f42f8521dc270dde63bfe012dc058b8d07993283e5feae6e670178e3: Status 404 returned error can't find the container with id a7403230f42f8521dc270dde63bfe012dc058b8d07993283e5feae6e670178e3 Feb 17 15:07:19 crc kubenswrapper[4717]: I0217 15:07:19.449859 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fc5qj" event={"ID":"61b00c35-b6c8-493d-a8bd-d8f85fd3fb94","Type":"ContainerStarted","Data":"3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be"} Feb 17 15:07:19 crc kubenswrapper[4717]: I0217 15:07:19.450010 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-fc5qj" podUID="61b00c35-b6c8-493d-a8bd-d8f85fd3fb94" containerName="registry-server" containerID="cri-o://3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be" gracePeriod=2 Feb 17 15:07:19 crc kubenswrapper[4717]: I0217 15:07:19.453208 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jpq9r" event={"ID":"e69d57ba-48b1-48d9-b658-c0a86cf05ab4","Type":"ContainerStarted","Data":"d2f2d39b09c442c8272bf6ae2e10468a2f9096a6651193807282453a4ac3d471"} Feb 17 15:07:19 crc kubenswrapper[4717]: I0217 15:07:19.453295 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jpq9r" 
event={"ID":"e69d57ba-48b1-48d9-b658-c0a86cf05ab4","Type":"ContainerStarted","Data":"a7403230f42f8521dc270dde63bfe012dc058b8d07993283e5feae6e670178e3"} Feb 17 15:07:19 crc kubenswrapper[4717]: I0217 15:07:19.471925 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fc5qj" podStartSLOduration=1.773295302 podStartE2EDuration="5.471909604s" podCreationTimestamp="2026-02-17 15:07:14 +0000 UTC" firstStartedPulling="2026-02-17 15:07:15.021657795 +0000 UTC m=+901.437498271" lastFinishedPulling="2026-02-17 15:07:18.720272097 +0000 UTC m=+905.136112573" observedRunningTime="2026-02-17 15:07:19.470797812 +0000 UTC m=+905.886638308" watchObservedRunningTime="2026-02-17 15:07:19.471909604 +0000 UTC m=+905.887750080" Feb 17 15:07:19 crc kubenswrapper[4717]: I0217 15:07:19.866405 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fc5qj" Feb 17 15:07:19 crc kubenswrapper[4717]: I0217 15:07:19.882553 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jpq9r" podStartSLOduration=1.835291277 podStartE2EDuration="1.882531105s" podCreationTimestamp="2026-02-17 15:07:18 +0000 UTC" firstStartedPulling="2026-02-17 15:07:19.105480793 +0000 UTC m=+905.521321269" lastFinishedPulling="2026-02-17 15:07:19.152720611 +0000 UTC m=+905.568561097" observedRunningTime="2026-02-17 15:07:19.497702429 +0000 UTC m=+905.913542925" watchObservedRunningTime="2026-02-17 15:07:19.882531105 +0000 UTC m=+906.298371581" Feb 17 15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.014555 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7vhj\" (UniqueName: \"kubernetes.io/projected/61b00c35-b6c8-493d-a8bd-d8f85fd3fb94-kube-api-access-w7vhj\") pod \"61b00c35-b6c8-493d-a8bd-d8f85fd3fb94\" (UID: \"61b00c35-b6c8-493d-a8bd-d8f85fd3fb94\") " Feb 17 
15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.020852 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61b00c35-b6c8-493d-a8bd-d8f85fd3fb94-kube-api-access-w7vhj" (OuterVolumeSpecName: "kube-api-access-w7vhj") pod "61b00c35-b6c8-493d-a8bd-d8f85fd3fb94" (UID: "61b00c35-b6c8-493d-a8bd-d8f85fd3fb94"). InnerVolumeSpecName "kube-api-access-w7vhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.116871 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7vhj\" (UniqueName: \"kubernetes.io/projected/61b00c35-b6c8-493d-a8bd-d8f85fd3fb94-kube-api-access-w7vhj\") on node \"crc\" DevicePath \"\"" Feb 17 15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.462150 4717 generic.go:334] "Generic (PLEG): container finished" podID="61b00c35-b6c8-493d-a8bd-d8f85fd3fb94" containerID="3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be" exitCode=0 Feb 17 15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.462223 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fc5qj" event={"ID":"61b00c35-b6c8-493d-a8bd-d8f85fd3fb94","Type":"ContainerDied","Data":"3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be"} Feb 17 15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.462247 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-fc5qj" Feb 17 15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.462427 4717 scope.go:117] "RemoveContainer" containerID="3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be" Feb 17 15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.462350 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fc5qj" event={"ID":"61b00c35-b6c8-493d-a8bd-d8f85fd3fb94","Type":"ContainerDied","Data":"4bb67ab6b6d7d22613541201a1c8934f2d8a17dc75965a3aae56f239f9db4aa7"} Feb 17 15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.486111 4717 scope.go:117] "RemoveContainer" containerID="3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be" Feb 17 15:07:20 crc kubenswrapper[4717]: E0217 15:07:20.488409 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be\": container with ID starting with 3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be not found: ID does not exist" containerID="3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be" Feb 17 15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.488477 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be"} err="failed to get container status \"3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be\": rpc error: code = NotFound desc = could not find container \"3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be\": container with ID starting with 3c99ae6990f2145f8bc2e94ec229689930a1a814ea053cfb5582c03713cd04be not found: ID does not exist" Feb 17 15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.508503 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-fc5qj"] 
Feb 17 15:07:20 crc kubenswrapper[4717]: I0217 15:07:20.515631 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-fc5qj"] Feb 17 15:07:21 crc kubenswrapper[4717]: I0217 15:07:21.853961 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61b00c35-b6c8-493d-a8bd-d8f85fd3fb94" path="/var/lib/kubelet/pods/61b00c35-b6c8-493d-a8bd-d8f85fd3fb94/volumes" Feb 17 15:07:28 crc kubenswrapper[4717]: I0217 15:07:28.403347 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-jpq9r" Feb 17 15:07:28 crc kubenswrapper[4717]: I0217 15:07:28.403725 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-jpq9r" Feb 17 15:07:28 crc kubenswrapper[4717]: I0217 15:07:28.433632 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-jpq9r" Feb 17 15:07:28 crc kubenswrapper[4717]: I0217 15:07:28.584738 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-jpq9r" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.299463 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb"] Feb 17 15:07:31 crc kubenswrapper[4717]: E0217 15:07:31.300144 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b00c35-b6c8-493d-a8bd-d8f85fd3fb94" containerName="registry-server" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.300170 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b00c35-b6c8-493d-a8bd-d8f85fd3fb94" containerName="registry-server" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.300429 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b00c35-b6c8-493d-a8bd-d8f85fd3fb94" 
containerName="registry-server" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.301780 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.334678 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-d6z2f" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.342472 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb"] Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.402164 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-util\") pod \"5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.402578 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2v8r\" (UniqueName: \"kubernetes.io/projected/af730b93-e04c-4361-9e27-67ed0596569a-kube-api-access-w2v8r\") pod \"5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.402775 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-bundle\") pod \"5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " 
pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.503786 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-bundle\") pod \"5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.504051 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-util\") pod \"5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.504230 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2v8r\" (UniqueName: \"kubernetes.io/projected/af730b93-e04c-4361-9e27-67ed0596569a-kube-api-access-w2v8r\") pod \"5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.504358 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-bundle\") pod \"5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.504495 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-util\") pod \"5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.523046 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2v8r\" (UniqueName: \"kubernetes.io/projected/af730b93-e04c-4361-9e27-67ed0596569a-kube-api-access-w2v8r\") pod \"5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:31 crc kubenswrapper[4717]: I0217 15:07:31.622950 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:32 crc kubenswrapper[4717]: I0217 15:07:32.037431 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb"] Feb 17 15:07:32 crc kubenswrapper[4717]: I0217 15:07:32.540866 4717 generic.go:334] "Generic (PLEG): container finished" podID="af730b93-e04c-4361-9e27-67ed0596569a" containerID="4cc693fa19b8a65b54d318966449df846a155ce14f0e460e41ee46e090c51fd1" exitCode=0 Feb 17 15:07:32 crc kubenswrapper[4717]: I0217 15:07:32.540916 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" event={"ID":"af730b93-e04c-4361-9e27-67ed0596569a","Type":"ContainerDied","Data":"4cc693fa19b8a65b54d318966449df846a155ce14f0e460e41ee46e090c51fd1"} Feb 17 15:07:32 crc kubenswrapper[4717]: I0217 15:07:32.540949 4717 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" event={"ID":"af730b93-e04c-4361-9e27-67ed0596569a","Type":"ContainerStarted","Data":"fdf20c5a4be54c928ca35071e109ce5bd7d3afcffee1d6fe73f9209d91d5c4ec"} Feb 17 15:07:33 crc kubenswrapper[4717]: I0217 15:07:33.549977 4717 generic.go:334] "Generic (PLEG): container finished" podID="af730b93-e04c-4361-9e27-67ed0596569a" containerID="ddcf1ec4168c0783feeca2fb6c557c9e2d1071f5a7c01d3b555deb6f04f2bc69" exitCode=0 Feb 17 15:07:33 crc kubenswrapper[4717]: I0217 15:07:33.550140 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" event={"ID":"af730b93-e04c-4361-9e27-67ed0596569a","Type":"ContainerDied","Data":"ddcf1ec4168c0783feeca2fb6c557c9e2d1071f5a7c01d3b555deb6f04f2bc69"} Feb 17 15:07:34 crc kubenswrapper[4717]: I0217 15:07:34.573152 4717 generic.go:334] "Generic (PLEG): container finished" podID="af730b93-e04c-4361-9e27-67ed0596569a" containerID="7e1b13c6ab85030fd63ec908553a49a547debd24adeec581bed9cfe2fd613346" exitCode=0 Feb 17 15:07:34 crc kubenswrapper[4717]: I0217 15:07:34.573606 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" event={"ID":"af730b93-e04c-4361-9e27-67ed0596569a","Type":"ContainerDied","Data":"7e1b13c6ab85030fd63ec908553a49a547debd24adeec581bed9cfe2fd613346"} Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.066679 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j9h4n"] Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.068389 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.083339 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j9h4n"] Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.158742 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-utilities\") pod \"community-operators-j9h4n\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.158852 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-catalog-content\") pod \"community-operators-j9h4n\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.158959 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m56n2\" (UniqueName: \"kubernetes.io/projected/4eac6901-09ad-43c5-90d2-ee6f1f23b268-kube-api-access-m56n2\") pod \"community-operators-j9h4n\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.260100 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-catalog-content\") pod \"community-operators-j9h4n\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.260396 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m56n2\" (UniqueName: \"kubernetes.io/projected/4eac6901-09ad-43c5-90d2-ee6f1f23b268-kube-api-access-m56n2\") pod \"community-operators-j9h4n\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.260503 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-utilities\") pod \"community-operators-j9h4n\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.260787 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-catalog-content\") pod \"community-operators-j9h4n\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.261053 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-utilities\") pod \"community-operators-j9h4n\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.278803 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m56n2\" (UniqueName: \"kubernetes.io/projected/4eac6901-09ad-43c5-90d2-ee6f1f23b268-kube-api-access-m56n2\") pod \"community-operators-j9h4n\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.395057 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.867165 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.912908 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j9h4n"] Feb 17 15:07:35 crc kubenswrapper[4717]: W0217 15:07:35.917644 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eac6901_09ad_43c5_90d2_ee6f1f23b268.slice/crio-454ac58cb819fde00bf3b816311d3a0bbecfbeef6e132b5ee3cf0c08cb7a349d WatchSource:0}: Error finding container 454ac58cb819fde00bf3b816311d3a0bbecfbeef6e132b5ee3cf0c08cb7a349d: Status 404 returned error can't find the container with id 454ac58cb819fde00bf3b816311d3a0bbecfbeef6e132b5ee3cf0c08cb7a349d Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.971402 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2v8r\" (UniqueName: \"kubernetes.io/projected/af730b93-e04c-4361-9e27-67ed0596569a-kube-api-access-w2v8r\") pod \"af730b93-e04c-4361-9e27-67ed0596569a\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.971585 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-util\") pod \"af730b93-e04c-4361-9e27-67ed0596569a\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.971633 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-bundle\") pod 
\"af730b93-e04c-4361-9e27-67ed0596569a\" (UID: \"af730b93-e04c-4361-9e27-67ed0596569a\") " Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.972744 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-bundle" (OuterVolumeSpecName: "bundle") pod "af730b93-e04c-4361-9e27-67ed0596569a" (UID: "af730b93-e04c-4361-9e27-67ed0596569a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.976408 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af730b93-e04c-4361-9e27-67ed0596569a-kube-api-access-w2v8r" (OuterVolumeSpecName: "kube-api-access-w2v8r") pod "af730b93-e04c-4361-9e27-67ed0596569a" (UID: "af730b93-e04c-4361-9e27-67ed0596569a"). InnerVolumeSpecName "kube-api-access-w2v8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:07:35 crc kubenswrapper[4717]: I0217 15:07:35.985292 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-util" (OuterVolumeSpecName: "util") pod "af730b93-e04c-4361-9e27-67ed0596569a" (UID: "af730b93-e04c-4361-9e27-67ed0596569a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:07:36 crc kubenswrapper[4717]: I0217 15:07:36.073531 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-util\") on node \"crc\" DevicePath \"\"" Feb 17 15:07:36 crc kubenswrapper[4717]: I0217 15:07:36.073570 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af730b93-e04c-4361-9e27-67ed0596569a-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:07:36 crc kubenswrapper[4717]: I0217 15:07:36.073584 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2v8r\" (UniqueName: \"kubernetes.io/projected/af730b93-e04c-4361-9e27-67ed0596569a-kube-api-access-w2v8r\") on node \"crc\" DevicePath \"\"" Feb 17 15:07:36 crc kubenswrapper[4717]: I0217 15:07:36.597829 4717 generic.go:334] "Generic (PLEG): container finished" podID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" containerID="3ad285a657c2fa38b3b7cc0d9e70cf5329ebf8caf1ae64cf6812be9b49ecf2f2" exitCode=0 Feb 17 15:07:36 crc kubenswrapper[4717]: I0217 15:07:36.597974 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9h4n" event={"ID":"4eac6901-09ad-43c5-90d2-ee6f1f23b268","Type":"ContainerDied","Data":"3ad285a657c2fa38b3b7cc0d9e70cf5329ebf8caf1ae64cf6812be9b49ecf2f2"} Feb 17 15:07:36 crc kubenswrapper[4717]: I0217 15:07:36.598009 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9h4n" event={"ID":"4eac6901-09ad-43c5-90d2-ee6f1f23b268","Type":"ContainerStarted","Data":"454ac58cb819fde00bf3b816311d3a0bbecfbeef6e132b5ee3cf0c08cb7a349d"} Feb 17 15:07:36 crc kubenswrapper[4717]: I0217 15:07:36.601520 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" 
event={"ID":"af730b93-e04c-4361-9e27-67ed0596569a","Type":"ContainerDied","Data":"fdf20c5a4be54c928ca35071e109ce5bd7d3afcffee1d6fe73f9209d91d5c4ec"} Feb 17 15:07:36 crc kubenswrapper[4717]: I0217 15:07:36.601557 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdf20c5a4be54c928ca35071e109ce5bd7d3afcffee1d6fe73f9209d91d5c4ec" Feb 17 15:07:36 crc kubenswrapper[4717]: I0217 15:07:36.601628 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb" Feb 17 15:07:37 crc kubenswrapper[4717]: I0217 15:07:37.609259 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9h4n" event={"ID":"4eac6901-09ad-43c5-90d2-ee6f1f23b268","Type":"ContainerStarted","Data":"1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df"} Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.181544 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc"] Feb 17 15:07:38 crc kubenswrapper[4717]: E0217 15:07:38.181852 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af730b93-e04c-4361-9e27-67ed0596569a" containerName="util" Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.181872 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="af730b93-e04c-4361-9e27-67ed0596569a" containerName="util" Feb 17 15:07:38 crc kubenswrapper[4717]: E0217 15:07:38.181901 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af730b93-e04c-4361-9e27-67ed0596569a" containerName="extract" Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.181910 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="af730b93-e04c-4361-9e27-67ed0596569a" containerName="extract" Feb 17 15:07:38 crc kubenswrapper[4717]: E0217 15:07:38.181920 4717 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="af730b93-e04c-4361-9e27-67ed0596569a" containerName="pull" Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.181929 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="af730b93-e04c-4361-9e27-67ed0596569a" containerName="pull" Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.182064 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="af730b93-e04c-4361-9e27-67ed0596569a" containerName="extract" Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.182612 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc" Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.196248 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-cx68z" Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.229559 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc"] Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.316042 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gswd\" (UniqueName: \"kubernetes.io/projected/ece1adcd-6e4d-4cf5-afd4-d108db8df6d6-kube-api-access-2gswd\") pod \"openstack-operator-controller-init-7948dfdc59-ljckc\" (UID: \"ece1adcd-6e4d-4cf5-afd4-d108db8df6d6\") " pod="openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc" Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.417388 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gswd\" (UniqueName: \"kubernetes.io/projected/ece1adcd-6e4d-4cf5-afd4-d108db8df6d6-kube-api-access-2gswd\") pod \"openstack-operator-controller-init-7948dfdc59-ljckc\" (UID: \"ece1adcd-6e4d-4cf5-afd4-d108db8df6d6\") " 
pod="openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc" Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.437005 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gswd\" (UniqueName: \"kubernetes.io/projected/ece1adcd-6e4d-4cf5-afd4-d108db8df6d6-kube-api-access-2gswd\") pod \"openstack-operator-controller-init-7948dfdc59-ljckc\" (UID: \"ece1adcd-6e4d-4cf5-afd4-d108db8df6d6\") " pod="openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc" Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.525623 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc" Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.629771 4717 generic.go:334] "Generic (PLEG): container finished" podID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" containerID="1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df" exitCode=0 Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.629970 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9h4n" event={"ID":"4eac6901-09ad-43c5-90d2-ee6f1f23b268","Type":"ContainerDied","Data":"1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df"} Feb 17 15:07:38 crc kubenswrapper[4717]: I0217 15:07:38.751232 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc"] Feb 17 15:07:39 crc kubenswrapper[4717]: I0217 15:07:39.637759 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc" event={"ID":"ece1adcd-6e4d-4cf5-afd4-d108db8df6d6","Type":"ContainerStarted","Data":"ed01589186fc47690a63d20a38e6125a6972d6c5ba30a46fb4486ba74a5829b3"} Feb 17 15:07:39 crc kubenswrapper[4717]: I0217 15:07:39.641212 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-j9h4n" event={"ID":"4eac6901-09ad-43c5-90d2-ee6f1f23b268","Type":"ContainerStarted","Data":"3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b"} Feb 17 15:07:39 crc kubenswrapper[4717]: I0217 15:07:39.665575 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j9h4n" podStartSLOduration=2.012084031 podStartE2EDuration="4.665553261s" podCreationTimestamp="2026-02-17 15:07:35 +0000 UTC" firstStartedPulling="2026-02-17 15:07:36.600549201 +0000 UTC m=+923.016389677" lastFinishedPulling="2026-02-17 15:07:39.254018431 +0000 UTC m=+925.669858907" observedRunningTime="2026-02-17 15:07:39.65957017 +0000 UTC m=+926.075410656" watchObservedRunningTime="2026-02-17 15:07:39.665553261 +0000 UTC m=+926.081393737" Feb 17 15:07:43 crc kubenswrapper[4717]: I0217 15:07:43.687663 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc" event={"ID":"ece1adcd-6e4d-4cf5-afd4-d108db8df6d6","Type":"ContainerStarted","Data":"a3e5a2409a9ab96246beb2338d655bd07456ce92dd6a057e34daa5af9af29524"} Feb 17 15:07:43 crc kubenswrapper[4717]: I0217 15:07:43.688426 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc" Feb 17 15:07:45 crc kubenswrapper[4717]: I0217 15:07:45.397353 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:45 crc kubenswrapper[4717]: I0217 15:07:45.397410 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:45 crc kubenswrapper[4717]: I0217 15:07:45.454649 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:45 crc 
kubenswrapper[4717]: I0217 15:07:45.485546 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc" podStartSLOduration=3.457788445 podStartE2EDuration="7.485519717s" podCreationTimestamp="2026-02-17 15:07:38 +0000 UTC" firstStartedPulling="2026-02-17 15:07:38.759648544 +0000 UTC m=+925.175489020" lastFinishedPulling="2026-02-17 15:07:42.787379816 +0000 UTC m=+929.203220292" observedRunningTime="2026-02-17 15:07:43.733679882 +0000 UTC m=+930.149520448" watchObservedRunningTime="2026-02-17 15:07:45.485519717 +0000 UTC m=+931.901360203" Feb 17 15:07:45 crc kubenswrapper[4717]: I0217 15:07:45.762704 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:46 crc kubenswrapper[4717]: I0217 15:07:46.251635 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j9h4n"] Feb 17 15:07:47 crc kubenswrapper[4717]: I0217 15:07:47.717927 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j9h4n" podUID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" containerName="registry-server" containerID="cri-o://3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b" gracePeriod=2 Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.140703 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.272705 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m56n2\" (UniqueName: \"kubernetes.io/projected/4eac6901-09ad-43c5-90d2-ee6f1f23b268-kube-api-access-m56n2\") pod \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.272792 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-catalog-content\") pod \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.272877 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-utilities\") pod \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\" (UID: \"4eac6901-09ad-43c5-90d2-ee6f1f23b268\") " Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.273891 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-utilities" (OuterVolumeSpecName: "utilities") pod "4eac6901-09ad-43c5-90d2-ee6f1f23b268" (UID: "4eac6901-09ad-43c5-90d2-ee6f1f23b268"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.281373 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eac6901-09ad-43c5-90d2-ee6f1f23b268-kube-api-access-m56n2" (OuterVolumeSpecName: "kube-api-access-m56n2") pod "4eac6901-09ad-43c5-90d2-ee6f1f23b268" (UID: "4eac6901-09ad-43c5-90d2-ee6f1f23b268"). InnerVolumeSpecName "kube-api-access-m56n2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.350068 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4eac6901-09ad-43c5-90d2-ee6f1f23b268" (UID: "4eac6901-09ad-43c5-90d2-ee6f1f23b268"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.374502 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.374545 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m56n2\" (UniqueName: \"kubernetes.io/projected/4eac6901-09ad-43c5-90d2-ee6f1f23b268-kube-api-access-m56n2\") on node \"crc\" DevicePath \"\"" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.374561 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eac6901-09ad-43c5-90d2-ee6f1f23b268-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.529810 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7948dfdc59-ljckc" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.727219 4717 generic.go:334] "Generic (PLEG): container finished" podID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" containerID="3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b" exitCode=0 Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.727280 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9h4n" 
event={"ID":"4eac6901-09ad-43c5-90d2-ee6f1f23b268","Type":"ContainerDied","Data":"3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b"} Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.727299 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j9h4n" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.727323 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9h4n" event={"ID":"4eac6901-09ad-43c5-90d2-ee6f1f23b268","Type":"ContainerDied","Data":"454ac58cb819fde00bf3b816311d3a0bbecfbeef6e132b5ee3cf0c08cb7a349d"} Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.727348 4717 scope.go:117] "RemoveContainer" containerID="3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.756685 4717 scope.go:117] "RemoveContainer" containerID="1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.757558 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j9h4n"] Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.761586 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j9h4n"] Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.796957 4717 scope.go:117] "RemoveContainer" containerID="3ad285a657c2fa38b3b7cc0d9e70cf5329ebf8caf1ae64cf6812be9b49ecf2f2" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.816489 4717 scope.go:117] "RemoveContainer" containerID="3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b" Feb 17 15:07:48 crc kubenswrapper[4717]: E0217 15:07:48.817074 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b\": container 
with ID starting with 3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b not found: ID does not exist" containerID="3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.817149 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b"} err="failed to get container status \"3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b\": rpc error: code = NotFound desc = could not find container \"3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b\": container with ID starting with 3acd07fb9c60012f8db5fe1fb6b6bbf5d6983476d42c3fab2150528854c5566b not found: ID does not exist" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.817207 4717 scope.go:117] "RemoveContainer" containerID="1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df" Feb 17 15:07:48 crc kubenswrapper[4717]: E0217 15:07:48.817841 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df\": container with ID starting with 1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df not found: ID does not exist" containerID="1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.817875 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df"} err="failed to get container status \"1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df\": rpc error: code = NotFound desc = could not find container \"1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df\": container with ID starting with 1574653d39bb3b214962679deeb2ac196d0a236b37894ec5a268172b0691a1df not 
found: ID does not exist" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.817901 4717 scope.go:117] "RemoveContainer" containerID="3ad285a657c2fa38b3b7cc0d9e70cf5329ebf8caf1ae64cf6812be9b49ecf2f2" Feb 17 15:07:48 crc kubenswrapper[4717]: E0217 15:07:48.818351 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad285a657c2fa38b3b7cc0d9e70cf5329ebf8caf1ae64cf6812be9b49ecf2f2\": container with ID starting with 3ad285a657c2fa38b3b7cc0d9e70cf5329ebf8caf1ae64cf6812be9b49ecf2f2 not found: ID does not exist" containerID="3ad285a657c2fa38b3b7cc0d9e70cf5329ebf8caf1ae64cf6812be9b49ecf2f2" Feb 17 15:07:48 crc kubenswrapper[4717]: I0217 15:07:48.818396 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad285a657c2fa38b3b7cc0d9e70cf5329ebf8caf1ae64cf6812be9b49ecf2f2"} err="failed to get container status \"3ad285a657c2fa38b3b7cc0d9e70cf5329ebf8caf1ae64cf6812be9b49ecf2f2\": rpc error: code = NotFound desc = could not find container \"3ad285a657c2fa38b3b7cc0d9e70cf5329ebf8caf1ae64cf6812be9b49ecf2f2\": container with ID starting with 3ad285a657c2fa38b3b7cc0d9e70cf5329ebf8caf1ae64cf6812be9b49ecf2f2 not found: ID does not exist" Feb 17 15:07:49 crc kubenswrapper[4717]: I0217 15:07:49.857666 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" path="/var/lib/kubelet/pods/4eac6901-09ad-43c5-90d2-ee6f1f23b268/volumes" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.143324 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zjljp"] Feb 17 15:07:58 crc kubenswrapper[4717]: E0217 15:07:58.144032 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" containerName="extract-utilities" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.144044 4717 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" containerName="extract-utilities" Feb 17 15:07:58 crc kubenswrapper[4717]: E0217 15:07:58.144056 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" containerName="registry-server" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.144064 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" containerName="registry-server" Feb 17 15:07:58 crc kubenswrapper[4717]: E0217 15:07:58.144072 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" containerName="extract-content" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.144093 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" containerName="extract-content" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.144198 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eac6901-09ad-43c5-90d2-ee6f1f23b268" containerName="registry-server" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.145034 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.174182 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjljp"] Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.219893 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-utilities\") pod \"redhat-marketplace-zjljp\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.219945 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xwjn\" (UniqueName: \"kubernetes.io/projected/106d09b9-99cf-4500-88db-c6e8121491b6-kube-api-access-6xwjn\") pod \"redhat-marketplace-zjljp\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.220158 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-catalog-content\") pod \"redhat-marketplace-zjljp\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.321998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-utilities\") pod \"redhat-marketplace-zjljp\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.322051 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6xwjn\" (UniqueName: \"kubernetes.io/projected/106d09b9-99cf-4500-88db-c6e8121491b6-kube-api-access-6xwjn\") pod \"redhat-marketplace-zjljp\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.322112 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-catalog-content\") pod \"redhat-marketplace-zjljp\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.322556 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-utilities\") pod \"redhat-marketplace-zjljp\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.322604 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-catalog-content\") pod \"redhat-marketplace-zjljp\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.352348 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xwjn\" (UniqueName: \"kubernetes.io/projected/106d09b9-99cf-4500-88db-c6e8121491b6-kube-api-access-6xwjn\") pod \"redhat-marketplace-zjljp\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.463495 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:07:58 crc kubenswrapper[4717]: I0217 15:07:58.936649 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjljp"] Feb 17 15:07:59 crc kubenswrapper[4717]: I0217 15:07:59.812502 4717 generic.go:334] "Generic (PLEG): container finished" podID="106d09b9-99cf-4500-88db-c6e8121491b6" containerID="7d4e5dab5e0f52d07ea0d2b35b082437b5dab70e8747f631cffa723f1e3cb2d3" exitCode=0 Feb 17 15:07:59 crc kubenswrapper[4717]: I0217 15:07:59.812608 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjljp" event={"ID":"106d09b9-99cf-4500-88db-c6e8121491b6","Type":"ContainerDied","Data":"7d4e5dab5e0f52d07ea0d2b35b082437b5dab70e8747f631cffa723f1e3cb2d3"} Feb 17 15:07:59 crc kubenswrapper[4717]: I0217 15:07:59.812868 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjljp" event={"ID":"106d09b9-99cf-4500-88db-c6e8121491b6","Type":"ContainerStarted","Data":"929ac92c72d3a1ab10bd052521d0b0d85d6a3874004e9100f7ed79e758918067"} Feb 17 15:08:00 crc kubenswrapper[4717]: I0217 15:08:00.821709 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjljp" event={"ID":"106d09b9-99cf-4500-88db-c6e8121491b6","Type":"ContainerStarted","Data":"01d7669250069bed1a763163864e388d184e9fc39770d2c3dfbb7d2e0819cdad"} Feb 17 15:08:01 crc kubenswrapper[4717]: I0217 15:08:01.830589 4717 generic.go:334] "Generic (PLEG): container finished" podID="106d09b9-99cf-4500-88db-c6e8121491b6" containerID="01d7669250069bed1a763163864e388d184e9fc39770d2c3dfbb7d2e0819cdad" exitCode=0 Feb 17 15:08:01 crc kubenswrapper[4717]: I0217 15:08:01.830645 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjljp" 
event={"ID":"106d09b9-99cf-4500-88db-c6e8121491b6","Type":"ContainerDied","Data":"01d7669250069bed1a763163864e388d184e9fc39770d2c3dfbb7d2e0819cdad"} Feb 17 15:08:03 crc kubenswrapper[4717]: I0217 15:08:03.852810 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjljp" event={"ID":"106d09b9-99cf-4500-88db-c6e8121491b6","Type":"ContainerStarted","Data":"f7b4df57142a26768ba61fa1ea391f8241b8927c3a6266cda42adc0ba3286016"} Feb 17 15:08:03 crc kubenswrapper[4717]: I0217 15:08:03.871780 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zjljp" podStartSLOduration=2.936908315 podStartE2EDuration="5.871763412s" podCreationTimestamp="2026-02-17 15:07:58 +0000 UTC" firstStartedPulling="2026-02-17 15:07:59.814488249 +0000 UTC m=+946.230328725" lastFinishedPulling="2026-02-17 15:08:02.749343346 +0000 UTC m=+949.165183822" observedRunningTime="2026-02-17 15:08:03.871260128 +0000 UTC m=+950.287100624" watchObservedRunningTime="2026-02-17 15:08:03.871763412 +0000 UTC m=+950.287603898" Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:08.464404 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:08.465302 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:08.548449 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:08.945600 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:08.958855 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh"] Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:08.960000 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh" Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:08.963705 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-c9pfv" Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:08.967166 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9"] Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:08.968521 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:08.995010 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-6dwnp" Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:08.997117 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt9pt\" (UniqueName: \"kubernetes.io/projected/cd977519-10c4-4afe-8f51-52f6cab597f9-kube-api-access-tt9pt\") pod \"barbican-operator-controller-manager-868647ff47-fp8nh\" (UID: \"cd977519-10c4-4afe-8f51-52f6cab597f9\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh" Feb 17 15:08:08 crc kubenswrapper[4717]: I0217 15:08:09.000247 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.053786 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.055010 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.058826 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-j2dq6" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.079310 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.087417 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.088623 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.101845 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-pwgzw" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.102096 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.103225 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xz28\" (UniqueName: \"kubernetes.io/projected/55e71e0a-9623-4049-b828-77040b5dd36e-kube-api-access-7xz28\") pod \"designate-operator-controller-manager-6d8bf5c495-5ddqf\" (UID: \"55e71e0a-9623-4049-b828-77040b5dd36e\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.103284 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt9pt\" (UniqueName: \"kubernetes.io/projected/cd977519-10c4-4afe-8f51-52f6cab597f9-kube-api-access-tt9pt\") pod \"barbican-operator-controller-manager-868647ff47-fp8nh\" (UID: \"cd977519-10c4-4afe-8f51-52f6cab597f9\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.103312 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47t98\" (UniqueName: \"kubernetes.io/projected/aee98c41-a3b5-43ba-b272-279e5836df0b-kube-api-access-47t98\") pod \"cinder-operator-controller-manager-5d946d989d-5rhm9\" (UID: \"aee98c41-a3b5-43ba-b272-279e5836df0b\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.103344 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgctk\" (UniqueName: \"kubernetes.io/projected/0004ea51-4233-47ad-a9d9-8e5d745a55f8-kube-api-access-qgctk\") pod \"glance-operator-controller-manager-77987464f4-qdg4j\" (UID: \"0004ea51-4233-47ad-a9d9-8e5d745a55f8\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.119748 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjljp"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.133361 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.138909 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt9pt\" (UniqueName: \"kubernetes.io/projected/cd977519-10c4-4afe-8f51-52f6cab597f9-kube-api-access-tt9pt\") pod \"barbican-operator-controller-manager-868647ff47-fp8nh\" (UID: \"cd977519-10c4-4afe-8f51-52f6cab597f9\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.146947 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-5d768"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.152066 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.160205 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-5d768"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.170452 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-fw5hl" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.171724 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.173168 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.179402 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.180868 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-sz78h" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.193737 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.198677 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kt6mm" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.199429 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.206167 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47t98\" (UniqueName: \"kubernetes.io/projected/aee98c41-a3b5-43ba-b272-279e5836df0b-kube-api-access-47t98\") pod \"cinder-operator-controller-manager-5d946d989d-5rhm9\" (UID: \"aee98c41-a3b5-43ba-b272-279e5836df0b\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.206236 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgctk\" (UniqueName: \"kubernetes.io/projected/0004ea51-4233-47ad-a9d9-8e5d745a55f8-kube-api-access-qgctk\") pod \"glance-operator-controller-manager-77987464f4-qdg4j\" (UID: \"0004ea51-4233-47ad-a9d9-8e5d745a55f8\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.206307 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xz28\" (UniqueName: \"kubernetes.io/projected/55e71e0a-9623-4049-b828-77040b5dd36e-kube-api-access-7xz28\") pod \"designate-operator-controller-manager-6d8bf5c495-5ddqf\" (UID: \"55e71e0a-9623-4049-b828-77040b5dd36e\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.207473 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.222529 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.229789 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgctk\" (UniqueName: \"kubernetes.io/projected/0004ea51-4233-47ad-a9d9-8e5d745a55f8-kube-api-access-qgctk\") pod \"glance-operator-controller-manager-77987464f4-qdg4j\" (UID: \"0004ea51-4233-47ad-a9d9-8e5d745a55f8\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.233632 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.234665 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.236830 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xz28\" (UniqueName: \"kubernetes.io/projected/55e71e0a-9623-4049-b828-77040b5dd36e-kube-api-access-7xz28\") pod \"designate-operator-controller-manager-6d8bf5c495-5ddqf\" (UID: \"55e71e0a-9623-4049-b828-77040b5dd36e\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.241184 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-d5szl" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.246711 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47t98\" (UniqueName: \"kubernetes.io/projected/aee98c41-a3b5-43ba-b272-279e5836df0b-kube-api-access-47t98\") pod \"cinder-operator-controller-manager-5d946d989d-5rhm9\" (UID: \"aee98c41-a3b5-43ba-b272-279e5836df0b\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.273419 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.292243 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.293262 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.303533 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-x6s4j" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.304020 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.304246 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.305528 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.308174 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cj2d\" (UniqueName: \"kubernetes.io/projected/9aa57429-09b8-4262-8356-fd8ea486b236-kube-api-access-7cj2d\") pod \"infra-operator-controller-manager-79d975b745-d6ssr\" (UID: \"9aa57429-09b8-4262-8356-fd8ea486b236\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.308215 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl84s\" (UniqueName: \"kubernetes.io/projected/c12ec16a-c8fd-48ae-8c86-257bcef97050-kube-api-access-kl84s\") pod \"heat-operator-controller-manager-69f49c598c-5d768\" (UID: \"c12ec16a-c8fd-48ae-8c86-257bcef97050\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.308300 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw72k\" (UniqueName: \"kubernetes.io/projected/9bc8e53b-549b-48d9-810f-25ce640b7339-kube-api-access-kw72k\") pod \"horizon-operator-controller-manager-5b9b8895d5-9kp5d\" (UID: \"9bc8e53b-549b-48d9-810f-25ce640b7339\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.308324 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert\") pod \"infra-operator-controller-manager-79d975b745-d6ssr\" (UID: \"9aa57429-09b8-4262-8356-fd8ea486b236\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.310141 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.312618 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-v9x94" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.335750 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.344153 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.345224 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.374553 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-844s8" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.380470 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.387152 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.400671 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.401427 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.405630 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.410529 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert\") pod \"infra-operator-controller-manager-79d975b745-d6ssr\" (UID: \"9aa57429-09b8-4262-8356-fd8ea486b236\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.410596 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vknvt\" (UniqueName: \"kubernetes.io/projected/22accf0b-8a4d-478a-bc45-d5bd4aa45b87-kube-api-access-vknvt\") pod \"neutron-operator-controller-manager-64ddbf8bb-gmfn2\" (UID: \"22accf0b-8a4d-478a-bc45-d5bd4aa45b87\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.410632 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cj2d\" (UniqueName: \"kubernetes.io/projected/9aa57429-09b8-4262-8356-fd8ea486b236-kube-api-access-7cj2d\") pod \"infra-operator-controller-manager-79d975b745-d6ssr\" (UID: \"9aa57429-09b8-4262-8356-fd8ea486b236\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.410653 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl84s\" (UniqueName: \"kubernetes.io/projected/c12ec16a-c8fd-48ae-8c86-257bcef97050-kube-api-access-kl84s\") pod \"heat-operator-controller-manager-69f49c598c-5d768\" (UID: 
\"c12ec16a-c8fd-48ae-8c86-257bcef97050\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.410691 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l4nm\" (UniqueName: \"kubernetes.io/projected/563dc2c2-73a1-485a-ab9e-6f7e0b3423cb-kube-api-access-4l4nm\") pod \"mariadb-operator-controller-manager-6994f66f48-s6xjq\" (UID: \"563dc2c2-73a1-485a-ab9e-6f7e0b3423cb\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.410728 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7knwr\" (UniqueName: \"kubernetes.io/projected/5390fc8f-fcef-4a64-8d3c-4ba8e9b2f86d-kube-api-access-7knwr\") pod \"ironic-operator-controller-manager-554564d7fc-lcth2\" (UID: \"5390fc8f-fcef-4a64-8d3c-4ba8e9b2f86d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.410772 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndm7g\" (UniqueName: \"kubernetes.io/projected/514fed4f-53b1-4b52-8b25-7e4ec648e155-kube-api-access-ndm7g\") pod \"manila-operator-controller-manager-54f6768c69-dtxn5\" (UID: \"514fed4f-53b1-4b52-8b25-7e4ec648e155\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.410796 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk58d\" (UniqueName: \"kubernetes.io/projected/6edcc364-656f-4f2d-aa9a-3409b3b58471-kube-api-access-hk58d\") pod \"keystone-operator-controller-manager-b4d948c87-d7sx6\" (UID: \"6edcc364-656f-4f2d-aa9a-3409b3b58471\") " 
pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.410823 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw72k\" (UniqueName: \"kubernetes.io/projected/9bc8e53b-549b-48d9-810f-25ce640b7339-kube-api-access-kw72k\") pod \"horizon-operator-controller-manager-5b9b8895d5-9kp5d\" (UID: \"9bc8e53b-549b-48d9-810f-25ce640b7339\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d" Feb 17 15:08:09 crc kubenswrapper[4717]: E0217 15:08:09.411287 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 15:08:09 crc kubenswrapper[4717]: E0217 15:08:09.411341 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert podName:9aa57429-09b8-4262-8356-fd8ea486b236 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:09.91132242 +0000 UTC m=+956.327162896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert") pod "infra-operator-controller-manager-79d975b745-d6ssr" (UID: "9aa57429-09b8-4262-8356-fd8ea486b236") : secret "infra-operator-webhook-server-cert" not found Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.416508 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vljb6" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.420598 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.422031 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.440059 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.453737 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.454925 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.459903 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw72k\" (UniqueName: \"kubernetes.io/projected/9bc8e53b-549b-48d9-810f-25ce640b7339-kube-api-access-kw72k\") pod \"horizon-operator-controller-manager-5b9b8895d5-9kp5d\" (UID: \"9bc8e53b-549b-48d9-810f-25ce640b7339\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.460576 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-s8ztw" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.461115 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl84s\" (UniqueName: \"kubernetes.io/projected/c12ec16a-c8fd-48ae-8c86-257bcef97050-kube-api-access-kl84s\") pod \"heat-operator-controller-manager-69f49c598c-5d768\" (UID: \"c12ec16a-c8fd-48ae-8c86-257bcef97050\") " 
pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.467153 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cj2d\" (UniqueName: \"kubernetes.io/projected/9aa57429-09b8-4262-8356-fd8ea486b236-kube-api-access-7cj2d\") pod \"infra-operator-controller-manager-79d975b745-d6ssr\" (UID: \"9aa57429-09b8-4262-8356-fd8ea486b236\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.467984 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.468114 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.469424 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.506159 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-87ngw" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.522552 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.534365 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l4nm\" (UniqueName: \"kubernetes.io/projected/563dc2c2-73a1-485a-ab9e-6f7e0b3423cb-kube-api-access-4l4nm\") pod \"mariadb-operator-controller-manager-6994f66f48-s6xjq\" (UID: \"563dc2c2-73a1-485a-ab9e-6f7e0b3423cb\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.604643 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7knwr\" (UniqueName: \"kubernetes.io/projected/5390fc8f-fcef-4a64-8d3c-4ba8e9b2f86d-kube-api-access-7knwr\") pod \"ironic-operator-controller-manager-554564d7fc-lcth2\" (UID: \"5390fc8f-fcef-4a64-8d3c-4ba8e9b2f86d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.604841 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndm7g\" (UniqueName: \"kubernetes.io/projected/514fed4f-53b1-4b52-8b25-7e4ec648e155-kube-api-access-ndm7g\") pod \"manila-operator-controller-manager-54f6768c69-dtxn5\" (UID: \"514fed4f-53b1-4b52-8b25-7e4ec648e155\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.604942 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk58d\" (UniqueName: \"kubernetes.io/projected/6edcc364-656f-4f2d-aa9a-3409b3b58471-kube-api-access-hk58d\") pod \"keystone-operator-controller-manager-b4d948c87-d7sx6\" (UID: \"6edcc364-656f-4f2d-aa9a-3409b3b58471\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 
15:08:09.605146 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vknvt\" (UniqueName: \"kubernetes.io/projected/22accf0b-8a4d-478a-bc45-d5bd4aa45b87-kube-api-access-vknvt\") pod \"neutron-operator-controller-manager-64ddbf8bb-gmfn2\" (UID: \"22accf0b-8a4d-478a-bc45-d5bd4aa45b87\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.573683 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.596256 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l4nm\" (UniqueName: \"kubernetes.io/projected/563dc2c2-73a1-485a-ab9e-6f7e0b3423cb-kube-api-access-4l4nm\") pod \"mariadb-operator-controller-manager-6994f66f48-s6xjq\" (UID: \"563dc2c2-73a1-485a-ab9e-6f7e0b3423cb\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.601341 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.638547 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.644138 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.645284 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.650264 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.651697 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-ph47k" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.651967 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.652591 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.653058 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-k72df" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.664280 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndm7g\" (UniqueName: \"kubernetes.io/projected/514fed4f-53b1-4b52-8b25-7e4ec648e155-kube-api-access-ndm7g\") pod \"manila-operator-controller-manager-54f6768c69-dtxn5\" (UID: \"514fed4f-53b1-4b52-8b25-7e4ec648e155\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.673157 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.684905 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vknvt\" (UniqueName: 
\"kubernetes.io/projected/22accf0b-8a4d-478a-bc45-d5bd4aa45b87-kube-api-access-vknvt\") pod \"neutron-operator-controller-manager-64ddbf8bb-gmfn2\" (UID: \"22accf0b-8a4d-478a-bc45-d5bd4aa45b87\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.693626 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.694874 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.695245 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.703324 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk58d\" (UniqueName: \"kubernetes.io/projected/6edcc364-656f-4f2d-aa9a-3409b3b58471-kube-api-access-hk58d\") pod \"keystone-operator-controller-manager-b4d948c87-d7sx6\" (UID: \"6edcc364-656f-4f2d-aa9a-3409b3b58471\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.703753 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-pg648" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.703899 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-lvx45" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.705363 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796"] Feb 17 15:08:09 crc 
kubenswrapper[4717]: I0217 15:08:09.706262 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.706290 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4kc2\" (UniqueName: \"kubernetes.io/projected/8a4392b6-6232-4135-91f9-676c565446fc-kube-api-access-k4kc2\") pod \"ovn-operator-controller-manager-d44cf6b75-x6nmz\" (UID: \"8a4392b6-6232-4135-91f9-676c565446fc\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.706347 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrqd7\" (UniqueName: \"kubernetes.io/projected/b4295f2b-c9b6-4604-b910-525c07cca2ed-kube-api-access-mrqd7\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.706374 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6nnc\" (UniqueName: \"kubernetes.io/projected/3a97433f-8ce2-446e-92ef-170a4996ffe8-kube-api-access-k6nnc\") pod \"nova-operator-controller-manager-567668f5cf-22tfq\" (UID: \"3a97433f-8ce2-446e-92ef-170a4996ffe8\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.706439 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrfqh\" (UniqueName: \"kubernetes.io/projected/3dcfb2d1-e765-4eb0-8300-f7567a34cae7-kube-api-access-qrfqh\") pod \"octavia-operator-controller-manager-69f8888797-mspl4\" (UID: \"3dcfb2d1-e765-4eb0-8300-f7567a34cae7\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.716699 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7knwr\" (UniqueName: \"kubernetes.io/projected/5390fc8f-fcef-4a64-8d3c-4ba8e9b2f86d-kube-api-access-7knwr\") pod \"ironic-operator-controller-manager-554564d7fc-lcth2\" (UID: \"5390fc8f-fcef-4a64-8d3c-4ba8e9b2f86d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.719141 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.727467 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.741618 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.742132 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.743427 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.745823 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-t644t" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.765040 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.775273 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.793464 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-zlkwq"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.799685 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.808777 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrfqh\" (UniqueName: \"kubernetes.io/projected/3dcfb2d1-e765-4eb0-8300-f7567a34cae7-kube-api-access-qrfqh\") pod \"octavia-operator-controller-manager-69f8888797-mspl4\" (UID: \"3dcfb2d1-e765-4eb0-8300-f7567a34cae7\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.808833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d266r\" (UniqueName: \"kubernetes.io/projected/2872debe-d42d-4955-bcca-5006aa7a2ecc-kube-api-access-d266r\") pod \"swift-operator-controller-manager-68f46476f-j9tsj\" (UID: \"2872debe-d42d-4955-bcca-5006aa7a2ecc\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.808861 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.808885 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4kc2\" (UniqueName: \"kubernetes.io/projected/8a4392b6-6232-4135-91f9-676c565446fc-kube-api-access-k4kc2\") pod \"ovn-operator-controller-manager-d44cf6b75-x6nmz\" (UID: \"8a4392b6-6232-4135-91f9-676c565446fc\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.812512 
4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-7jjkt" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.813470 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvqgk\" (UniqueName: \"kubernetes.io/projected/ae48b931-1ac6-43ed-a407-b3fbb3d56178-kube-api-access-cvqgk\") pod \"placement-operator-controller-manager-8497b45c89-hzzpc\" (UID: \"ae48b931-1ac6-43ed-a407-b3fbb3d56178\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.813616 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrqd7\" (UniqueName: \"kubernetes.io/projected/b4295f2b-c9b6-4604-b910-525c07cca2ed-kube-api-access-mrqd7\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.813872 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6nnc\" (UniqueName: \"kubernetes.io/projected/3a97433f-8ce2-446e-92ef-170a4996ffe8-kube-api-access-k6nnc\") pod \"nova-operator-controller-manager-567668f5cf-22tfq\" (UID: \"3a97433f-8ce2-446e-92ef-170a4996ffe8\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.813977 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln5qz\" (UniqueName: \"kubernetes.io/projected/af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb-kube-api-access-ln5qz\") pod \"telemetry-operator-controller-manager-7f45b4ff68-spbft\" (UID: \"af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb\") " 
pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" Feb 17 15:08:09 crc kubenswrapper[4717]: E0217 15:08:09.826272 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:09 crc kubenswrapper[4717]: E0217 15:08:09.826350 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert podName:b4295f2b-c9b6-4604-b910-525c07cca2ed nodeName:}" failed. No retries permitted until 2026-02-17 15:08:10.326328119 +0000 UTC m=+956.742168595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" (UID: "b4295f2b-c9b6-4604-b910-525c07cca2ed") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.833063 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.843638 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4kc2\" (UniqueName: \"kubernetes.io/projected/8a4392b6-6232-4135-91f9-676c565446fc-kube-api-access-k4kc2\") pod \"ovn-operator-controller-manager-d44cf6b75-x6nmz\" (UID: \"8a4392b6-6232-4135-91f9-676c565446fc\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.847048 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrfqh\" (UniqueName: \"kubernetes.io/projected/3dcfb2d1-e765-4eb0-8300-f7567a34cae7-kube-api-access-qrfqh\") pod \"octavia-operator-controller-manager-69f8888797-mspl4\" (UID: \"3dcfb2d1-e765-4eb0-8300-f7567a34cae7\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.859111 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6nnc\" (UniqueName: \"kubernetes.io/projected/3a97433f-8ce2-446e-92ef-170a4996ffe8-kube-api-access-k6nnc\") pod \"nova-operator-controller-manager-567668f5cf-22tfq\" (UID: \"3a97433f-8ce2-446e-92ef-170a4996ffe8\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.859970 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.869949 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrqd7\" (UniqueName: \"kubernetes.io/projected/b4295f2b-c9b6-4604-b910-525c07cca2ed-kube-api-access-mrqd7\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.889909 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-zlkwq"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.896519 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.897788 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.898428 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.907064 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-hdgs9" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.907644 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.915705 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln5qz\" (UniqueName: \"kubernetes.io/projected/af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb-kube-api-access-ln5qz\") pod \"telemetry-operator-controller-manager-7f45b4ff68-spbft\" (UID: \"af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.915795 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d266r\" (UniqueName: \"kubernetes.io/projected/2872debe-d42d-4955-bcca-5006aa7a2ecc-kube-api-access-d266r\") pod \"swift-operator-controller-manager-68f46476f-j9tsj\" (UID: \"2872debe-d42d-4955-bcca-5006aa7a2ecc\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.915846 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvps6\" (UniqueName: \"kubernetes.io/projected/aaf337fd-63ab-42de-a465-02fecc40116b-kube-api-access-xvps6\") pod \"test-operator-controller-manager-7866795846-zlkwq\" (UID: \"aaf337fd-63ab-42de-a465-02fecc40116b\") " pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.915865 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert\") pod \"infra-operator-controller-manager-79d975b745-d6ssr\" (UID: \"9aa57429-09b8-4262-8356-fd8ea486b236\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.915889 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvqgk\" (UniqueName: \"kubernetes.io/projected/ae48b931-1ac6-43ed-a407-b3fbb3d56178-kube-api-access-cvqgk\") pod \"placement-operator-controller-manager-8497b45c89-hzzpc\" (UID: \"ae48b931-1ac6-43ed-a407-b3fbb3d56178\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" Feb 17 15:08:09 crc kubenswrapper[4717]: E0217 15:08:09.917607 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 15:08:09 crc kubenswrapper[4717]: E0217 15:08:09.917659 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert podName:9aa57429-09b8-4262-8356-fd8ea486b236 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:10.917639627 +0000 UTC m=+957.333480313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert") pod "infra-operator-controller-manager-79d975b745-d6ssr" (UID: "9aa57429-09b8-4262-8356-fd8ea486b236") : secret "infra-operator-webhook-server-cert" not found Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.937218 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.938146 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.947303 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.947475 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.947785 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wsqnw" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.948698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvqgk\" (UniqueName: \"kubernetes.io/projected/ae48b931-1ac6-43ed-a407-b3fbb3d56178-kube-api-access-cvqgk\") pod \"placement-operator-controller-manager-8497b45c89-hzzpc\" (UID: \"ae48b931-1ac6-43ed-a407-b3fbb3d56178\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.948884 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d266r\" (UniqueName: \"kubernetes.io/projected/2872debe-d42d-4955-bcca-5006aa7a2ecc-kube-api-access-d266r\") pod \"swift-operator-controller-manager-68f46476f-j9tsj\" (UID: \"2872debe-d42d-4955-bcca-5006aa7a2ecc\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.954826 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.961738 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln5qz\" (UniqueName: 
\"kubernetes.io/projected/af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb-kube-api-access-ln5qz\") pod \"telemetry-operator-controller-manager-7f45b4ff68-spbft\" (UID: \"af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.967426 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.998408 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d"] Feb 17 15:08:09 crc kubenswrapper[4717]: I0217 15:08:09.999543 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.012539 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.012635 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-k9r6f" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.023162 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.024245 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.024423 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvps6\" (UniqueName: \"kubernetes.io/projected/aaf337fd-63ab-42de-a465-02fecc40116b-kube-api-access-xvps6\") pod \"test-operator-controller-manager-7866795846-zlkwq\" (UID: \"aaf337fd-63ab-42de-a465-02fecc40116b\") " pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.024570 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-554fz\" (UniqueName: \"kubernetes.io/projected/5fdc2829-32d7-456a-99f3-e15b957b272e-kube-api-access-554fz\") pod 
\"watcher-operator-controller-manager-5db88f68c-jq8zs\" (UID: \"5fdc2829-32d7-456a-99f3-e15b957b272e\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.024929 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2phz\" (UniqueName: \"kubernetes.io/projected/f2279bdf-746c-4e8c-8703-74a256bd7923-kube-api-access-h2phz\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.026951 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d"] Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.055872 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.096070 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.096991 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.097140 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvps6\" (UniqueName: \"kubernetes.io/projected/aaf337fd-63ab-42de-a465-02fecc40116b-kube-api-access-xvps6\") pod \"test-operator-controller-manager-7866795846-zlkwq\" (UID: \"aaf337fd-63ab-42de-a465-02fecc40116b\") " pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.131353 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.133596 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.133670 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.133836 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc8s4\" (UniqueName: \"kubernetes.io/projected/dd3497e6-6de6-4bdf-a23e-16adc21de6ab-kube-api-access-gc8s4\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-2w74d\" (UID: \"dd3497e6-6de6-4bdf-a23e-16adc21de6ab\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.133866 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-554fz\" (UniqueName: \"kubernetes.io/projected/5fdc2829-32d7-456a-99f3-e15b957b272e-kube-api-access-554fz\") pod \"watcher-operator-controller-manager-5db88f68c-jq8zs\" (UID: \"5fdc2829-32d7-456a-99f3-e15b957b272e\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.133925 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2phz\" (UniqueName: \"kubernetes.io/projected/f2279bdf-746c-4e8c-8703-74a256bd7923-kube-api-access-h2phz\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.134981 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.135029 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:10.635015182 +0000 UTC m=+957.050855658 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "webhook-server-cert" not found Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.135245 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.135298 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:10.63528824 +0000 UTC m=+957.051128716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "metrics-server-cert" not found Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.152563 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2phz\" (UniqueName: \"kubernetes.io/projected/f2279bdf-746c-4e8c-8703-74a256bd7923-kube-api-access-h2phz\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.160804 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-554fz\" (UniqueName: \"kubernetes.io/projected/5fdc2829-32d7-456a-99f3-e15b957b272e-kube-api-access-554fz\") pod \"watcher-operator-controller-manager-5db88f68c-jq8zs\" (UID: \"5fdc2829-32d7-456a-99f3-e15b957b272e\") " 
pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.236175 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc8s4\" (UniqueName: \"kubernetes.io/projected/dd3497e6-6de6-4bdf-a23e-16adc21de6ab-kube-api-access-gc8s4\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2w74d\" (UID: \"dd3497e6-6de6-4bdf-a23e-16adc21de6ab\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.265346 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc8s4\" (UniqueName: \"kubernetes.io/projected/dd3497e6-6de6-4bdf-a23e-16adc21de6ab-kube-api-access-gc8s4\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2w74d\" (UID: \"dd3497e6-6de6-4bdf-a23e-16adc21de6ab\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.338033 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.338274 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.338334 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert podName:b4295f2b-c9b6-4604-b910-525c07cca2ed nodeName:}" failed. 
No retries permitted until 2026-02-17 15:08:11.338316827 +0000 UTC m=+957.754157303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" (UID: "b4295f2b-c9b6-4604-b910-525c07cca2ed") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.340687 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.379268 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf"] Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.392420 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9"] Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.411172 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh"] Feb 17 15:08:10 crc kubenswrapper[4717]: W0217 15:08:10.445538 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee98c41_a3b5_43ba_b272_279e5836df0b.slice/crio-48af4b3c348e315bc4711c53654a005ebc47e5a893ae9cb797710afac98654b9 WatchSource:0}: Error finding container 48af4b3c348e315bc4711c53654a005ebc47e5a893ae9cb797710afac98654b9: Status 404 returned error can't find the container with id 48af4b3c348e315bc4711c53654a005ebc47e5a893ae9cb797710afac98654b9 Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.446494 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d" Feb 17 15:08:10 crc kubenswrapper[4717]: W0217 15:08:10.452379 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55e71e0a_9623_4049_b828_77040b5dd36e.slice/crio-b2e6d5ea35377f3a059f4cb82db88e03659b5c8f4e8eb117e038bc2e3b0bf959 WatchSource:0}: Error finding container b2e6d5ea35377f3a059f4cb82db88e03659b5c8f4e8eb117e038bc2e3b0bf959: Status 404 returned error can't find the container with id b2e6d5ea35377f3a059f4cb82db88e03659b5c8f4e8eb117e038bc2e3b0bf959 Feb 17 15:08:10 crc kubenswrapper[4717]: W0217 15:08:10.467703 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd977519_10c4_4afe_8f51_52f6cab597f9.slice/crio-71126cf72b90bf4c985fe9dfc0630ef15daa88148e27c79a09d3c08208fcbe7a WatchSource:0}: Error finding container 71126cf72b90bf4c985fe9dfc0630ef15daa88148e27c79a09d3c08208fcbe7a: Status 404 returned error can't find the container with id 71126cf72b90bf4c985fe9dfc0630ef15daa88148e27c79a09d3c08208fcbe7a Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.563437 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j"] Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.644787 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.644946 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not 
found Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.645002 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:11.644985641 +0000 UTC m=+958.060826107 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "webhook-server-cert" not found Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.645016 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.645496 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.645555 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:11.645546417 +0000 UTC m=+958.061386893 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "metrics-server-cert" not found Feb 17 15:08:10 crc kubenswrapper[4717]: W0217 15:08:10.673462 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5390fc8f_fcef_4a64_8d3c_4ba8e9b2f86d.slice/crio-59c7e3e578252b3c6c9e54e59270b972747308af74fe94534513f67b8c20b067 WatchSource:0}: Error finding container 59c7e3e578252b3c6c9e54e59270b972747308af74fe94534513f67b8c20b067: Status 404 returned error can't find the container with id 59c7e3e578252b3c6c9e54e59270b972747308af74fe94534513f67b8c20b067 Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.675458 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d"] Feb 17 15:08:10 crc kubenswrapper[4717]: W0217 15:08:10.679193 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bc8e53b_549b_48d9_810f_25ce640b7339.slice/crio-9c4b37ab5ebffdb75d8847532327a971008e82dd35110656f819493752fc9346 WatchSource:0}: Error finding container 9c4b37ab5ebffdb75d8847532327a971008e82dd35110656f819493752fc9346: Status 404 returned error can't find the container with id 9c4b37ab5ebffdb75d8847532327a971008e82dd35110656f819493752fc9346 Feb 17 15:08:10 crc kubenswrapper[4717]: W0217 15:08:10.679724 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc12ec16a_c8fd_48ae_8c86_257bcef97050.slice/crio-ecd8093108ed477e7d153edebee9c800d7cfb2142ea4a1b6bf1d086114230438 WatchSource:0}: Error finding container ecd8093108ed477e7d153edebee9c800d7cfb2142ea4a1b6bf1d086114230438: Status 404 
returned error can't find the container with id ecd8093108ed477e7d153edebee9c800d7cfb2142ea4a1b6bf1d086114230438 Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.683814 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-5d768"] Feb 17 15:08:10 crc kubenswrapper[4717]: W0217 15:08:10.689863 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod563dc2c2_73a1_485a_ab9e_6f7e0b3423cb.slice/crio-07f6378d38e49745e1cde9770dea650e954ef07769f3b277348df3c882fa432c WatchSource:0}: Error finding container 07f6378d38e49745e1cde9770dea650e954ef07769f3b277348df3c882fa432c: Status 404 returned error can't find the container with id 07f6378d38e49745e1cde9770dea650e954ef07769f3b277348df3c882fa432c Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.690745 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2"] Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.697169 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq"] Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.900948 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2"] Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.908508 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4"] Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.915610 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6"] Feb 17 15:08:10 crc kubenswrapper[4717]: W0217 15:08:10.921074 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a4392b6_6232_4135_91f9_676c565446fc.slice/crio-c2080a77f929dc59736073297ac74bc98e9ed6a15a2642277f6a4e8e7b262c94 WatchSource:0}: Error finding container c2080a77f929dc59736073297ac74bc98e9ed6a15a2642277f6a4e8e7b262c94: Status 404 returned error can't find the container with id c2080a77f929dc59736073297ac74bc98e9ed6a15a2642277f6a4e8e7b262c94 Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.921882 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5"] Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.928416 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz"] Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.931786 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" event={"ID":"22accf0b-8a4d-478a-bc45-d5bd4aa45b87","Type":"ContainerStarted","Data":"b5d1f05909bdd1d01c6f409cc59b263f2f15262ff4353de9e2ceac87794cd02a"} Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.932132 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ndm7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-dtxn5_openstack-operators(514fed4f-53b1-4b52-8b25-7e4ec648e155): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.933035 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" 
event={"ID":"c12ec16a-c8fd-48ae-8c86-257bcef97050","Type":"ContainerStarted","Data":"ecd8093108ed477e7d153edebee9c800d7cfb2142ea4a1b6bf1d086114230438"} Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.933327 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" podUID="514fed4f-53b1-4b52-8b25-7e4ec648e155" Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.935378 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d" event={"ID":"9bc8e53b-549b-48d9-810f-25ce640b7339","Type":"ContainerStarted","Data":"9c4b37ab5ebffdb75d8847532327a971008e82dd35110656f819493752fc9346"} Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.937872 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2" event={"ID":"5390fc8f-fcef-4a64-8d3c-4ba8e9b2f86d","Type":"ContainerStarted","Data":"59c7e3e578252b3c6c9e54e59270b972747308af74fe94534513f67b8c20b067"} Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.949099 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" event={"ID":"3dcfb2d1-e765-4eb0-8300-f7567a34cae7","Type":"ContainerStarted","Data":"f249d6c5b1b47d95835de346fc3bed1a46ce3ad10d8e516ec614b81895a39881"} Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.950993 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert\") pod \"infra-operator-controller-manager-79d975b745-d6ssr\" (UID: \"9aa57429-09b8-4262-8356-fd8ea486b236\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 
15:08:10.951181 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 15:08:10 crc kubenswrapper[4717]: E0217 15:08:10.951234 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert podName:9aa57429-09b8-4262-8356-fd8ea486b236 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:12.951217935 +0000 UTC m=+959.367058411 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert") pod "infra-operator-controller-manager-79d975b745-d6ssr" (UID: "9aa57429-09b8-4262-8356-fd8ea486b236") : secret "infra-operator-webhook-server-cert" not found Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.951517 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" event={"ID":"aee98c41-a3b5-43ba-b272-279e5836df0b","Type":"ContainerStarted","Data":"48af4b3c348e315bc4711c53654a005ebc47e5a893ae9cb797710afac98654b9"} Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.954575 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq" event={"ID":"563dc2c2-73a1-485a-ab9e-6f7e0b3423cb","Type":"ContainerStarted","Data":"07f6378d38e49745e1cde9770dea650e954ef07769f3b277348df3c882fa432c"} Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.956265 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf" event={"ID":"55e71e0a-9623-4049-b828-77040b5dd36e","Type":"ContainerStarted","Data":"b2e6d5ea35377f3a059f4cb82db88e03659b5c8f4e8eb117e038bc2e3b0bf959"} Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.958314 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" event={"ID":"0004ea51-4233-47ad-a9d9-8e5d745a55f8","Type":"ContainerStarted","Data":"93ea0b5d1df19ab29688fc6ec12718c3ea702437c599eb691918c3d3b79e1d8c"} Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.960814 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zjljp" podUID="106d09b9-99cf-4500-88db-c6e8121491b6" containerName="registry-server" containerID="cri-o://f7b4df57142a26768ba61fa1ea391f8241b8927c3a6266cda42adc0ba3286016" gracePeriod=2 Feb 17 15:08:10 crc kubenswrapper[4717]: I0217 15:08:10.961238 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh" event={"ID":"cd977519-10c4-4afe-8f51-52f6cab597f9","Type":"ContainerStarted","Data":"71126cf72b90bf4c985fe9dfc0630ef15daa88148e27c79a09d3c08208fcbe7a"} Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.054805 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft"] Feb 17 15:08:11 crc kubenswrapper[4717]: W0217 15:08:11.069072 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaf337fd_63ab_42de_a465_02fecc40116b.slice/crio-a27d6227f154f624123d7d35f7327ec7e4e7bbb6c725aa61fc2c4f7245787909 WatchSource:0}: Error finding container a27d6227f154f624123d7d35f7327ec7e4e7bbb6c725aa61fc2c4f7245787909: Status 404 returned error can't find the container with id a27d6227f154f624123d7d35f7327ec7e4e7bbb6c725aa61fc2c4f7245787909 Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.070098 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc"] Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.075879 4717 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-zlkwq"] Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.078565 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-554fz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-jq8zs_openstack-operators(5fdc2829-32d7-456a-99f3-e15b957b272e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.078975 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cvqgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-hzzpc_openstack-operators(ae48b931-1ac6-43ed-a407-b3fbb3d56178): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 15:08:11 crc kubenswrapper[4717]: W0217 15:08:11.079661 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a97433f_8ce2_446e_92ef_170a4996ffe8.slice/crio-bbfb8a41a974afe0bb3a621c5c58b7c758a180a5645e2d9a332be6cea9bd5740 WatchSource:0}: Error finding container 
bbfb8a41a974afe0bb3a621c5c58b7c758a180a5645e2d9a332be6cea9bd5740: Status 404 returned error can't find the container with id bbfb8a41a974afe0bb3a621c5c58b7c758a180a5645e2d9a332be6cea9bd5740 Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.079661 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" podUID="5fdc2829-32d7-456a-99f3-e15b957b272e" Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.079969 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvps6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-zlkwq_openstack-operators(aaf337fd-63ab-42de-a465-02fecc40116b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 15:08:11 crc kubenswrapper[4717]: W0217 15:08:11.080041 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2872debe_d42d_4955_bcca_5006aa7a2ecc.slice/crio-c15e1d80e9100a0422bf7f0c10ed5290dba3a5a6a0a91ac63b163223c1e6867d WatchSource:0}: Error finding container c15e1d80e9100a0422bf7f0c10ed5290dba3a5a6a0a91ac63b163223c1e6867d: Status 404 returned error can't find the container with id c15e1d80e9100a0422bf7f0c10ed5290dba3a5a6a0a91ac63b163223c1e6867d Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.080072 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" podUID="ae48b931-1ac6-43ed-a407-b3fbb3d56178" Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.081068 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" podUID="aaf337fd-63ab-42de-a465-02fecc40116b" Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.081347 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k6nnc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-22tfq_openstack-operators(3a97433f-8ce2-446e-92ef-170a4996ffe8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.082464 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" podUID="3a97433f-8ce2-446e-92ef-170a4996ffe8" Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.082645 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d266r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-j9tsj_openstack-operators(2872debe-d42d-4955-bcca-5006aa7a2ecc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.082690 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs"] Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.083722 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" podUID="2872debe-d42d-4955-bcca-5006aa7a2ecc" Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.092134 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq"] Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.097291 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj"] Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.210860 4717 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d"] Feb 17 15:08:11 crc kubenswrapper[4717]: W0217 15:08:11.290970 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd3497e6_6de6_4bdf_a23e_16adc21de6ab.slice/crio-9a5cc601508c5c2ef3aac9731c84e245668124763746df86d9cdd709964e57d3 WatchSource:0}: Error finding container 9a5cc601508c5c2ef3aac9731c84e245668124763746df86d9cdd709964e57d3: Status 404 returned error can't find the container with id 9a5cc601508c5c2ef3aac9731c84e245668124763746df86d9cdd709964e57d3 Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.357411 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.357579 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.357641 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert podName:b4295f2b-c9b6-4604-b910-525c07cca2ed nodeName:}" failed. No retries permitted until 2026-02-17 15:08:13.357626478 +0000 UTC m=+959.773466944 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" (UID: "b4295f2b-c9b6-4604-b910-525c07cca2ed") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.662141 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.662383 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.662524 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.662578 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:13.662563735 +0000 UTC m=+960.078404211 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "webhook-server-cert" not found Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.663527 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.663577 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:13.663564303 +0000 UTC m=+960.079404779 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "metrics-server-cert" not found Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.972868 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" event={"ID":"3a97433f-8ce2-446e-92ef-170a4996ffe8","Type":"ContainerStarted","Data":"bbfb8a41a974afe0bb3a621c5c58b7c758a180a5645e2d9a332be6cea9bd5740"} Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.974747 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" event={"ID":"2872debe-d42d-4955-bcca-5006aa7a2ecc","Type":"ContainerStarted","Data":"c15e1d80e9100a0422bf7f0c10ed5290dba3a5a6a0a91ac63b163223c1e6867d"} Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.977322 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" podUID="3a97433f-8ce2-446e-92ef-170a4996ffe8" Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.977952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" event={"ID":"514fed4f-53b1-4b52-8b25-7e4ec648e155","Type":"ContainerStarted","Data":"b568d08e40832233595935e391de89417ecaa4002b86e4e81da8e290ab593c9e"} Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.978051 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" podUID="2872debe-d42d-4955-bcca-5006aa7a2ecc" Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 15:08:11.982283 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" podUID="514fed4f-53b1-4b52-8b25-7e4ec648e155" Feb 17 15:08:11 crc kubenswrapper[4717]: I0217 15:08:11.982685 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" event={"ID":"5fdc2829-32d7-456a-99f3-e15b957b272e","Type":"ContainerStarted","Data":"6d77c8e0ab59fd8b6ecdfa5be48f287c1160d93143fcae771b45b22ba7e504ce"} Feb 17 15:08:11 crc kubenswrapper[4717]: E0217 
15:08:11.987516 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" podUID="5fdc2829-32d7-456a-99f3-e15b957b272e" Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.004207 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" event={"ID":"8a4392b6-6232-4135-91f9-676c565446fc","Type":"ContainerStarted","Data":"c2080a77f929dc59736073297ac74bc98e9ed6a15a2642277f6a4e8e7b262c94"} Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.008904 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" event={"ID":"aaf337fd-63ab-42de-a465-02fecc40116b","Type":"ContainerStarted","Data":"a27d6227f154f624123d7d35f7327ec7e4e7bbb6c725aa61fc2c4f7245787909"} Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.013030 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" event={"ID":"af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb","Type":"ContainerStarted","Data":"0de7beeb8a712875d950baac221133ed19e3e5ceffda93614133e24a32a443af"} Feb 17 15:08:12 crc kubenswrapper[4717]: E0217 15:08:12.013507 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" podUID="aaf337fd-63ab-42de-a465-02fecc40116b" Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.015456 
4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" event={"ID":"ae48b931-1ac6-43ed-a407-b3fbb3d56178","Type":"ContainerStarted","Data":"ce10f3faeb4a00fc24baac31d8bc55b95db884f5a60932e320ea8a5bd51f69f1"} Feb 17 15:08:12 crc kubenswrapper[4717]: E0217 15:08:12.017830 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" podUID="ae48b931-1ac6-43ed-a407-b3fbb3d56178" Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.042214 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d" event={"ID":"dd3497e6-6de6-4bdf-a23e-16adc21de6ab","Type":"ContainerStarted","Data":"9a5cc601508c5c2ef3aac9731c84e245668124763746df86d9cdd709964e57d3"} Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.045629 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" event={"ID":"6edcc364-656f-4f2d-aa9a-3409b3b58471","Type":"ContainerStarted","Data":"bbba0fd28bb9b212dfe7dd8bdbb1315c52c3d5b19489a8b897da5d2ab40bf228"} Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.074345 4717 generic.go:334] "Generic (PLEG): container finished" podID="106d09b9-99cf-4500-88db-c6e8121491b6" containerID="f7b4df57142a26768ba61fa1ea391f8241b8927c3a6266cda42adc0ba3286016" exitCode=0 Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.074394 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjljp" 
event={"ID":"106d09b9-99cf-4500-88db-c6e8121491b6","Type":"ContainerDied","Data":"f7b4df57142a26768ba61fa1ea391f8241b8927c3a6266cda42adc0ba3286016"} Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.137002 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.171882 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-utilities\") pod \"106d09b9-99cf-4500-88db-c6e8121491b6\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.172032 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xwjn\" (UniqueName: \"kubernetes.io/projected/106d09b9-99cf-4500-88db-c6e8121491b6-kube-api-access-6xwjn\") pod \"106d09b9-99cf-4500-88db-c6e8121491b6\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.172175 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-catalog-content\") pod \"106d09b9-99cf-4500-88db-c6e8121491b6\" (UID: \"106d09b9-99cf-4500-88db-c6e8121491b6\") " Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.175072 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-utilities" (OuterVolumeSpecName: "utilities") pod "106d09b9-99cf-4500-88db-c6e8121491b6" (UID: "106d09b9-99cf-4500-88db-c6e8121491b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.194987 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106d09b9-99cf-4500-88db-c6e8121491b6-kube-api-access-6xwjn" (OuterVolumeSpecName: "kube-api-access-6xwjn") pod "106d09b9-99cf-4500-88db-c6e8121491b6" (UID: "106d09b9-99cf-4500-88db-c6e8121491b6"). InnerVolumeSpecName "kube-api-access-6xwjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.233437 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "106d09b9-99cf-4500-88db-c6e8121491b6" (UID: "106d09b9-99cf-4500-88db-c6e8121491b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.274361 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xwjn\" (UniqueName: \"kubernetes.io/projected/106d09b9-99cf-4500-88db-c6e8121491b6-kube-api-access-6xwjn\") on node \"crc\" DevicePath \"\"" Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.274399 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.274414 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/106d09b9-99cf-4500-88db-c6e8121491b6-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:08:12 crc kubenswrapper[4717]: I0217 15:08:12.985314 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert\") pod \"infra-operator-controller-manager-79d975b745-d6ssr\" (UID: \"9aa57429-09b8-4262-8356-fd8ea486b236\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:12 crc kubenswrapper[4717]: E0217 15:08:12.985525 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 15:08:12 crc kubenswrapper[4717]: E0217 15:08:12.985623 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert podName:9aa57429-09b8-4262-8356-fd8ea486b236 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:16.98559862 +0000 UTC m=+963.401439096 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert") pod "infra-operator-controller-manager-79d975b745-d6ssr" (UID: "9aa57429-09b8-4262-8356-fd8ea486b236") : secret "infra-operator-webhook-server-cert" not found Feb 17 15:08:13 crc kubenswrapper[4717]: I0217 15:08:13.104156 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjljp" event={"ID":"106d09b9-99cf-4500-88db-c6e8121491b6","Type":"ContainerDied","Data":"929ac92c72d3a1ab10bd052521d0b0d85d6a3874004e9100f7ed79e758918067"} Feb 17 15:08:13 crc kubenswrapper[4717]: I0217 15:08:13.104235 4717 scope.go:117] "RemoveContainer" containerID="f7b4df57142a26768ba61fa1ea391f8241b8927c3a6266cda42adc0ba3286016" Feb 17 15:08:13 crc kubenswrapper[4717]: I0217 15:08:13.104521 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjljp" Feb 17 15:08:13 crc kubenswrapper[4717]: E0217 15:08:13.116436 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" podUID="ae48b931-1ac6-43ed-a407-b3fbb3d56178" Feb 17 15:08:13 crc kubenswrapper[4717]: E0217 15:08:13.117032 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" podUID="514fed4f-53b1-4b52-8b25-7e4ec648e155" Feb 17 15:08:13 crc kubenswrapper[4717]: E0217 15:08:13.117335 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" podUID="2872debe-d42d-4955-bcca-5006aa7a2ecc" Feb 17 15:08:13 crc kubenswrapper[4717]: E0217 15:08:13.117406 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" podUID="3a97433f-8ce2-446e-92ef-170a4996ffe8" Feb 17 15:08:13 crc 
kubenswrapper[4717]: E0217 15:08:13.117817 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" podUID="5fdc2829-32d7-456a-99f3-e15b957b272e" Feb 17 15:08:13 crc kubenswrapper[4717]: E0217 15:08:13.133438 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" podUID="aaf337fd-63ab-42de-a465-02fecc40116b" Feb 17 15:08:13 crc kubenswrapper[4717]: I0217 15:08:13.240544 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjljp"] Feb 17 15:08:13 crc kubenswrapper[4717]: I0217 15:08:13.247051 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjljp"] Feb 17 15:08:13 crc kubenswrapper[4717]: I0217 15:08:13.394104 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:13 crc kubenswrapper[4717]: E0217 15:08:13.394314 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:13 crc kubenswrapper[4717]: E0217 
15:08:13.394372 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert podName:b4295f2b-c9b6-4604-b910-525c07cca2ed nodeName:}" failed. No retries permitted until 2026-02-17 15:08:17.394356881 +0000 UTC m=+963.810197357 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" (UID: "b4295f2b-c9b6-4604-b910-525c07cca2ed") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:13 crc kubenswrapper[4717]: I0217 15:08:13.699344 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:13 crc kubenswrapper[4717]: E0217 15:08:13.699578 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 15:08:13 crc kubenswrapper[4717]: E0217 15:08:13.699926 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:17.699899504 +0000 UTC m=+964.115739980 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "webhook-server-cert" not found Feb 17 15:08:13 crc kubenswrapper[4717]: E0217 15:08:13.699959 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 15:08:13 crc kubenswrapper[4717]: E0217 15:08:13.700030 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:17.700013358 +0000 UTC m=+964.115853834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "metrics-server-cert" not found Feb 17 15:08:13 crc kubenswrapper[4717]: I0217 15:08:13.700478 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:13 crc kubenswrapper[4717]: I0217 15:08:13.857799 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106d09b9-99cf-4500-88db-c6e8121491b6" path="/var/lib/kubelet/pods/106d09b9-99cf-4500-88db-c6e8121491b6/volumes" Feb 17 15:08:17 crc kubenswrapper[4717]: I0217 15:08:17.064196 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert\") pod \"infra-operator-controller-manager-79d975b745-d6ssr\" (UID: \"9aa57429-09b8-4262-8356-fd8ea486b236\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:17 crc kubenswrapper[4717]: E0217 15:08:17.064464 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 15:08:17 crc kubenswrapper[4717]: E0217 15:08:17.064758 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert podName:9aa57429-09b8-4262-8356-fd8ea486b236 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:25.064727774 +0000 UTC m=+971.480568260 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert") pod "infra-operator-controller-manager-79d975b745-d6ssr" (UID: "9aa57429-09b8-4262-8356-fd8ea486b236") : secret "infra-operator-webhook-server-cert" not found Feb 17 15:08:17 crc kubenswrapper[4717]: I0217 15:08:17.478696 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:17 crc kubenswrapper[4717]: E0217 15:08:17.479043 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:17 crc kubenswrapper[4717]: E0217 15:08:17.479151 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert podName:b4295f2b-c9b6-4604-b910-525c07cca2ed nodeName:}" failed. No retries permitted until 2026-02-17 15:08:25.479124675 +0000 UTC m=+971.894965191 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" (UID: "b4295f2b-c9b6-4604-b910-525c07cca2ed") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:17 crc kubenswrapper[4717]: I0217 15:08:17.784024 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:17 crc kubenswrapper[4717]: I0217 15:08:17.784164 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:17 crc kubenswrapper[4717]: E0217 15:08:17.784376 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 15:08:17 crc kubenswrapper[4717]: E0217 15:08:17.784412 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 15:08:17 crc kubenswrapper[4717]: E0217 15:08:17.784469 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:25.784443162 +0000 UTC m=+972.200283638 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "metrics-server-cert" not found Feb 17 15:08:17 crc kubenswrapper[4717]: E0217 15:08:17.784493 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:25.784483203 +0000 UTC m=+972.200323669 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "webhook-server-cert" not found Feb 17 15:08:22 crc kubenswrapper[4717]: E0217 15:08:22.232919 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df" Feb 17 15:08:22 crc kubenswrapper[4717]: E0217 15:08:22.235437 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qgctk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987464f4-qdg4j_openstack-operators(0004ea51-4233-47ad-a9d9-8e5d745a55f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:08:22 crc kubenswrapper[4717]: E0217 15:08:22.236693 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" podUID="0004ea51-4233-47ad-a9d9-8e5d745a55f8" Feb 17 15:08:23 crc kubenswrapper[4717]: E0217 15:08:23.751474 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" podUID="0004ea51-4233-47ad-a9d9-8e5d745a55f8" Feb 17 15:08:24 crc kubenswrapper[4717]: E0217 15:08:24.672866 4717 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99" Feb 17 15:08:24 crc kubenswrapper[4717]: E0217 15:08:24.673144 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ln5qz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-spbft_openstack-operators(af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:08:24 crc kubenswrapper[4717]: E0217 15:08:24.674438 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" podUID="af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb" Feb 17 15:08:25 crc kubenswrapper[4717]: I0217 15:08:25.108716 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert\") pod \"infra-operator-controller-manager-79d975b745-d6ssr\" (UID: \"9aa57429-09b8-4262-8356-fd8ea486b236\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:25 crc kubenswrapper[4717]: I0217 15:08:25.120840 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/9aa57429-09b8-4262-8356-fd8ea486b236-cert\") pod \"infra-operator-controller-manager-79d975b745-d6ssr\" (UID: \"9aa57429-09b8-4262-8356-fd8ea486b236\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.221448 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34" Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.221659 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrfqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-mspl4_openstack-operators(3dcfb2d1-e765-4eb0-8300-f7567a34cae7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.222899 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" podUID="3dcfb2d1-e765-4eb0-8300-f7567a34cae7" Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.247035 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" podUID="af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb" Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.247892 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" podUID="3dcfb2d1-e765-4eb0-8300-f7567a34cae7" Feb 17 15:08:25 crc kubenswrapper[4717]: I0217 15:08:25.250845 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kt6mm" Feb 17 15:08:25 crc kubenswrapper[4717]: I0217 15:08:25.259320 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:25 crc kubenswrapper[4717]: I0217 15:08:25.517962 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.518154 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.518494 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert podName:b4295f2b-c9b6-4604-b910-525c07cca2ed nodeName:}" failed. No retries permitted until 2026-02-17 15:08:41.51847563 +0000 UTC m=+987.934316106 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" (UID: "b4295f2b-c9b6-4604-b910-525c07cca2ed") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 15:08:25 crc kubenswrapper[4717]: I0217 15:08:25.824980 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:25 crc kubenswrapper[4717]: I0217 15:08:25.825139 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.825272 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.825320 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:41.825306221 +0000 UTC m=+988.241146697 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "webhook-server-cert" not found Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.825674 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.825703 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs podName:f2279bdf-746c-4e8c-8703-74a256bd7923 nodeName:}" failed. No retries permitted until 2026-02-17 15:08:41.825696252 +0000 UTC m=+988.241536728 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs") pod "openstack-operator-controller-manager-7d87bc949d-9mr4h" (UID: "f2279bdf-746c-4e8c-8703-74a256bd7923") : secret "metrics-server-cert" not found Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.904763 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979" Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.905035 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-47t98,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5d946d989d-5rhm9_openstack-operators(aee98c41-a3b5-43ba-b272-279e5836df0b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:08:25 crc kubenswrapper[4717]: E0217 15:08:25.906387 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" podUID="aee98c41-a3b5-43ba-b272-279e5836df0b" Feb 17 15:08:25 crc kubenswrapper[4717]: I0217 15:08:25.908119 4717 scope.go:117] "RemoveContainer" containerID="01d7669250069bed1a763163864e388d184e9fc39770d2c3dfbb7d2e0819cdad" Feb 17 15:08:26 crc kubenswrapper[4717]: E0217 15:08:26.261597 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979\\\"\"" 
pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" podUID="aee98c41-a3b5-43ba-b272-279e5836df0b" Feb 17 15:08:26 crc kubenswrapper[4717]: E0217 15:08:26.549916 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759" Feb 17 15:08:26 crc kubenswrapper[4717]: E0217 15:08:26.550236 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k4kc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-x6nmz_openstack-operators(8a4392b6-6232-4135-91f9-676c565446fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:08:26 crc kubenswrapper[4717]: E0217 15:08:26.551430 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" podUID="8a4392b6-6232-4135-91f9-676c565446fc" Feb 17 15:08:27 crc kubenswrapper[4717]: E0217 15:08:27.266688 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" podUID="8a4392b6-6232-4135-91f9-676c565446fc" Feb 17 15:08:28 crc kubenswrapper[4717]: E0217 15:08:28.676485 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 17 15:08:28 crc kubenswrapper[4717]: E0217 15:08:28.677216 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gc8s4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-2w74d_openstack-operators(dd3497e6-6de6-4bdf-a23e-16adc21de6ab): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:08:28 crc kubenswrapper[4717]: E0217 15:08:28.678451 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d" podUID="dd3497e6-6de6-4bdf-a23e-16adc21de6ab" Feb 17 15:08:29 crc kubenswrapper[4717]: E0217 15:08:29.284704 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d" podUID="dd3497e6-6de6-4bdf-a23e-16adc21de6ab" Feb 17 15:08:30 crc 
kubenswrapper[4717]: E0217 15:08:30.631974 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 17 15:08:30 crc kubenswrapper[4717]: E0217 15:08:30.632823 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vknvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-gmfn2_openstack-operators(22accf0b-8a4d-478a-bc45-d5bd4aa45b87): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:08:30 crc kubenswrapper[4717]: E0217 15:08:30.634651 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" podUID="22accf0b-8a4d-478a-bc45-d5bd4aa45b87" Feb 17 15:08:31 crc kubenswrapper[4717]: E0217 15:08:31.147064 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:e8a675284ff97a1d3f0f07583863be20b20b4aa48ebb34dbc80d83fe39d757b2" Feb 17 15:08:31 crc kubenswrapper[4717]: E0217 15:08:31.147317 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:e8a675284ff97a1d3f0f07583863be20b20b4aa48ebb34dbc80d83fe39d757b2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kl84s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-69f49c598c-5d768_openstack-operators(c12ec16a-c8fd-48ae-8c86-257bcef97050): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:08:31 crc kubenswrapper[4717]: E0217 15:08:31.149409 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" podUID="c12ec16a-c8fd-48ae-8c86-257bcef97050" Feb 17 15:08:31 crc kubenswrapper[4717]: E0217 15:08:31.302860 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" podUID="22accf0b-8a4d-478a-bc45-d5bd4aa45b87" Feb 17 15:08:31 crc kubenswrapper[4717]: E0217 15:08:31.303489 4717 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:e8a675284ff97a1d3f0f07583863be20b20b4aa48ebb34dbc80d83fe39d757b2\\\"\"" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" podUID="c12ec16a-c8fd-48ae-8c86-257bcef97050" Feb 17 15:08:33 crc kubenswrapper[4717]: E0217 15:08:33.091750 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 17 15:08:33 crc kubenswrapper[4717]: E0217 15:08:33.092426 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hk58d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-d7sx6_openstack-operators(6edcc364-656f-4f2d-aa9a-3409b3b58471): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:08:33 crc kubenswrapper[4717]: E0217 15:08:33.093678 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" podUID="6edcc364-656f-4f2d-aa9a-3409b3b58471" Feb 17 15:08:33 crc kubenswrapper[4717]: E0217 15:08:33.323123 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" podUID="6edcc364-656f-4f2d-aa9a-3409b3b58471" Feb 17 15:08:35 crc kubenswrapper[4717]: I0217 15:08:35.445535 4717 scope.go:117] "RemoveContainer" containerID="7d4e5dab5e0f52d07ea0d2b35b082437b5dab70e8747f631cffa723f1e3cb2d3" Feb 17 15:08:36 crc kubenswrapper[4717]: I0217 15:08:36.661708 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr"] Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.350622 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" event={"ID":"5fdc2829-32d7-456a-99f3-e15b957b272e","Type":"ContainerStarted","Data":"1525fbe893135e96fe81b0a15b0162cc7c35bb18d678c583183a32e96a6f66d9"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.351531 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.352568 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" event={"ID":"aaf337fd-63ab-42de-a465-02fecc40116b","Type":"ContainerStarted","Data":"53eb922888395970b198e4b8a4a7ea731dcb96cc791a3ded8b0ee7714dadd233"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.352972 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.354349 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" 
event={"ID":"3a97433f-8ce2-446e-92ef-170a4996ffe8","Type":"ContainerStarted","Data":"a4692cdc0d5ceefa660532308d46dba16d4eb0c45e003c5cb6f7bac8b17b2052"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.355124 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.355519 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" event={"ID":"9aa57429-09b8-4262-8356-fd8ea486b236","Type":"ContainerStarted","Data":"160a0edaddb1ab3b3de2cf502413420b6726bce964986e380038258640a9d0ac"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.356808 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2" event={"ID":"5390fc8f-fcef-4a64-8d3c-4ba8e9b2f86d","Type":"ContainerStarted","Data":"f178c761bd6f0915ba79ceaa2a747580b29370c0bae78e4b2f96736be030c712"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.357257 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.358690 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" event={"ID":"2872debe-d42d-4955-bcca-5006aa7a2ecc","Type":"ContainerStarted","Data":"b345ca48896e0c3f43d522d2a0ed78fd60a97c63e1873394680cd491aa7b75d6"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.359044 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.360052 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq" event={"ID":"563dc2c2-73a1-485a-ab9e-6f7e0b3423cb","Type":"ContainerStarted","Data":"4797fccc155ce76ac9acadcc36843e758220ad289a91e3d9cc74e56f655b8d0c"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.360181 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.361231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf" event={"ID":"55e71e0a-9623-4049-b828-77040b5dd36e","Type":"ContainerStarted","Data":"65ca831a890e19937d8ac3670817a636a5fbea3601f21f79b6774127126ecb37"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.361366 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.362863 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" event={"ID":"514fed4f-53b1-4b52-8b25-7e4ec648e155","Type":"ContainerStarted","Data":"e5cdc6fdc2601804aa53fbbe71c6b3bb0aad74c57585b456dea740ffe4d91382"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.365205 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.381747 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh" event={"ID":"cd977519-10c4-4afe-8f51-52f6cab597f9","Type":"ContainerStarted","Data":"e4c7c77ea0156ca89068c54ddf7d2a12ce327b9fc3ed7490e440e6782454c074"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.382002 4717 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.397697 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" event={"ID":"ae48b931-1ac6-43ed-a407-b3fbb3d56178","Type":"ContainerStarted","Data":"d515d9b394ced53dba6a12bbb172206d2f39041a331495fba2dda450f0499468"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.398467 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.451235 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" podStartSLOduration=3.239992923 podStartE2EDuration="28.451205603s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:11.078393193 +0000 UTC m=+957.494233669" lastFinishedPulling="2026-02-17 15:08:36.289605863 +0000 UTC m=+982.705446349" observedRunningTime="2026-02-17 15:08:37.450303637 +0000 UTC m=+983.866144123" watchObservedRunningTime="2026-02-17 15:08:37.451205603 +0000 UTC m=+983.867046079" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.476444 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d" event={"ID":"9bc8e53b-549b-48d9-810f-25ce640b7339","Type":"ContainerStarted","Data":"4bc04426ba41eb34b9fd611a644a4b21d328b7bab0b35cbfb0475db13d2632ce"} Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.479991 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.526127 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh" podStartSLOduration=6.951034646 podStartE2EDuration="29.526101398s" podCreationTimestamp="2026-02-17 15:08:08 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.484219978 +0000 UTC m=+956.900060454" lastFinishedPulling="2026-02-17 15:08:33.05928673 +0000 UTC m=+979.475127206" observedRunningTime="2026-02-17 15:08:37.511268294 +0000 UTC m=+983.927108770" watchObservedRunningTime="2026-02-17 15:08:37.526101398 +0000 UTC m=+983.941941884" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.563518 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" podStartSLOduration=3.207299335 podStartE2EDuration="28.56349814s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.931947756 +0000 UTC m=+957.347788232" lastFinishedPulling="2026-02-17 15:08:36.288146541 +0000 UTC m=+982.703987037" observedRunningTime="2026-02-17 15:08:37.562618224 +0000 UTC m=+983.978458720" watchObservedRunningTime="2026-02-17 15:08:37.56349814 +0000 UTC m=+983.979338616" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.599567 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" podStartSLOduration=3.5036454949999998 podStartE2EDuration="28.599547232s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:11.081195243 +0000 UTC m=+957.497035719" lastFinishedPulling="2026-02-17 15:08:36.17709698 +0000 UTC m=+982.592937456" observedRunningTime="2026-02-17 15:08:37.587206869 +0000 UTC m=+984.003047355" watchObservedRunningTime="2026-02-17 15:08:37.599547232 +0000 UTC m=+984.015387708" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.617494 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf" podStartSLOduration=7.015879959 podStartE2EDuration="29.617473876s" podCreationTimestamp="2026-02-17 15:08:08 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.456916681 +0000 UTC m=+956.872757157" lastFinishedPulling="2026-02-17 15:08:33.058510598 +0000 UTC m=+979.474351074" observedRunningTime="2026-02-17 15:08:37.615121248 +0000 UTC m=+984.030961744" watchObservedRunningTime="2026-02-17 15:08:37.617473876 +0000 UTC m=+984.033314352" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.648629 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" podStartSLOduration=3.5526841 podStartE2EDuration="28.648600947s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:11.079889826 +0000 UTC m=+957.495730302" lastFinishedPulling="2026-02-17 15:08:36.175806673 +0000 UTC m=+982.591647149" observedRunningTime="2026-02-17 15:08:37.646320142 +0000 UTC m=+984.062160618" watchObservedRunningTime="2026-02-17 15:08:37.648600947 +0000 UTC m=+984.064441443" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.678758 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" podStartSLOduration=3.472289726 podStartE2EDuration="28.67874047s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:11.082560032 +0000 UTC m=+957.498400498" lastFinishedPulling="2026-02-17 15:08:36.289010766 +0000 UTC m=+982.704851242" observedRunningTime="2026-02-17 15:08:37.674295603 +0000 UTC m=+984.090136079" watchObservedRunningTime="2026-02-17 15:08:37.67874047 +0000 UTC m=+984.094580936" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.703036 4717 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" podStartSLOduration=3.49126901 podStartE2EDuration="28.703008715s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:11.078851346 +0000 UTC m=+957.494691822" lastFinishedPulling="2026-02-17 15:08:36.290591041 +0000 UTC m=+982.706431527" observedRunningTime="2026-02-17 15:08:37.696208611 +0000 UTC m=+984.112049097" watchObservedRunningTime="2026-02-17 15:08:37.703008715 +0000 UTC m=+984.118849191" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.730559 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2" podStartSLOduration=6.352461937 podStartE2EDuration="28.730536694s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.67937729 +0000 UTC m=+957.095217766" lastFinishedPulling="2026-02-17 15:08:33.057452047 +0000 UTC m=+979.473292523" observedRunningTime="2026-02-17 15:08:37.719878849 +0000 UTC m=+984.135719345" watchObservedRunningTime="2026-02-17 15:08:37.730536694 +0000 UTC m=+984.146377170" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.754217 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq" podStartSLOduration=6.38896681 podStartE2EDuration="28.754196132s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.692254826 +0000 UTC m=+957.108095302" lastFinishedPulling="2026-02-17 15:08:33.057484118 +0000 UTC m=+979.473324624" observedRunningTime="2026-02-17 15:08:37.753500532 +0000 UTC m=+984.169341018" watchObservedRunningTime="2026-02-17 15:08:37.754196132 +0000 UTC m=+984.170036608" Feb 17 15:08:37 crc kubenswrapper[4717]: I0217 15:08:37.785862 4717 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d" podStartSLOduration=6.411632511 podStartE2EDuration="28.785842818s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.682709535 +0000 UTC m=+957.098550011" lastFinishedPulling="2026-02-17 15:08:33.056919842 +0000 UTC m=+979.472760318" observedRunningTime="2026-02-17 15:08:37.781846234 +0000 UTC m=+984.197686710" watchObservedRunningTime="2026-02-17 15:08:37.785842818 +0000 UTC m=+984.201683294" Feb 17 15:08:38 crc kubenswrapper[4717]: I0217 15:08:38.486920 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" event={"ID":"af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb","Type":"ContainerStarted","Data":"a4e446e3a458499ff27f61c3f81542b3770a4be556cb6a95dc3aec3a763c4468"} Feb 17 15:08:38 crc kubenswrapper[4717]: I0217 15:08:38.510716 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" podStartSLOduration=3.03666761 podStartE2EDuration="29.510695969s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:11.078313151 +0000 UTC m=+957.494153617" lastFinishedPulling="2026-02-17 15:08:37.5523415 +0000 UTC m=+983.968181976" observedRunningTime="2026-02-17 15:08:38.502790423 +0000 UTC m=+984.918630909" watchObservedRunningTime="2026-02-17 15:08:38.510695969 +0000 UTC m=+984.926536445" Feb 17 15:08:40 crc kubenswrapper[4717]: I0217 15:08:40.097602 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" Feb 17 15:08:40 crc kubenswrapper[4717]: I0217 15:08:40.508765 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" 
event={"ID":"aee98c41-a3b5-43ba-b272-279e5836df0b","Type":"ContainerStarted","Data":"6fa62c6fe4cc0ee9e93eedc1b7760eb114f46fa8fe1c5863adfe7cea83717f88"} Feb 17 15:08:40 crc kubenswrapper[4717]: I0217 15:08:40.509699 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" Feb 17 15:08:40 crc kubenswrapper[4717]: I0217 15:08:40.510456 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" event={"ID":"0004ea51-4233-47ad-a9d9-8e5d745a55f8","Type":"ContainerStarted","Data":"cbea0839a6879973e7e4c0fa329a71c85b04c2b0e3461e591b3f229ad212fc80"} Feb 17 15:08:40 crc kubenswrapper[4717]: I0217 15:08:40.510676 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" Feb 17 15:08:40 crc kubenswrapper[4717]: I0217 15:08:40.513128 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" event={"ID":"9aa57429-09b8-4262-8356-fd8ea486b236","Type":"ContainerStarted","Data":"d1625b9eba32626a0efb10f9ae4fd110432faacb72783d3c08fcd039687a98e0"} Feb 17 15:08:40 crc kubenswrapper[4717]: I0217 15:08:40.513293 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:40 crc kubenswrapper[4717]: I0217 15:08:40.535370 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" podStartSLOduration=2.929487192 podStartE2EDuration="32.535342639s" podCreationTimestamp="2026-02-17 15:08:08 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.479242746 +0000 UTC m=+956.895083222" lastFinishedPulling="2026-02-17 15:08:40.085098193 +0000 UTC m=+986.500938669" 
observedRunningTime="2026-02-17 15:08:40.528360469 +0000 UTC m=+986.944200965" watchObservedRunningTime="2026-02-17 15:08:40.535342639 +0000 UTC m=+986.951183115" Feb 17 15:08:40 crc kubenswrapper[4717]: I0217 15:08:40.557467 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" podStartSLOduration=28.162326989 podStartE2EDuration="31.557445522s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:36.686382017 +0000 UTC m=+983.102222493" lastFinishedPulling="2026-02-17 15:08:40.08150055 +0000 UTC m=+986.497341026" observedRunningTime="2026-02-17 15:08:40.553397686 +0000 UTC m=+986.969238202" watchObservedRunningTime="2026-02-17 15:08:40.557445522 +0000 UTC m=+986.973285998" Feb 17 15:08:40 crc kubenswrapper[4717]: I0217 15:08:40.575325 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" podStartSLOduration=3.051399338 podStartE2EDuration="32.575295933s" podCreationTimestamp="2026-02-17 15:08:08 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.55812547 +0000 UTC m=+956.973965946" lastFinishedPulling="2026-02-17 15:08:40.082022065 +0000 UTC m=+986.497862541" observedRunningTime="2026-02-17 15:08:40.570750863 +0000 UTC m=+986.986591349" watchObservedRunningTime="2026-02-17 15:08:40.575295933 +0000 UTC m=+986.991136409" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.522636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" event={"ID":"8a4392b6-6232-4135-91f9-676c565446fc","Type":"ContainerStarted","Data":"3659a336a34f4057369b64ed5aafa6804b082dfecbd63d9c1dc66a0e41280ad0"} Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.523482 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.525582 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" event={"ID":"3dcfb2d1-e765-4eb0-8300-f7567a34cae7","Type":"ContainerStarted","Data":"febe389ac0c301432b0ea988acea39f42dc0acd8b71d395670f961e4e3f49c28"} Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.549322 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" podStartSLOduration=2.8674904039999998 podStartE2EDuration="32.549298731s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.926714318 +0000 UTC m=+957.342554794" lastFinishedPulling="2026-02-17 15:08:40.608522645 +0000 UTC m=+987.024363121" observedRunningTime="2026-02-17 15:08:41.544775891 +0000 UTC m=+987.960616377" watchObservedRunningTime="2026-02-17 15:08:41.549298731 +0000 UTC m=+987.965139217" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.564050 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" podStartSLOduration=2.417465097 podStartE2EDuration="32.564029673s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.912322778 +0000 UTC m=+957.328163254" lastFinishedPulling="2026-02-17 15:08:41.058887354 +0000 UTC m=+987.474727830" observedRunningTime="2026-02-17 15:08:41.562564831 +0000 UTC m=+987.978405337" watchObservedRunningTime="2026-02-17 15:08:41.564029673 +0000 UTC m=+987.979870169" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.605824 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert\") pod 
\"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.621192 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4295f2b-c9b6-4604-b910-525c07cca2ed-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cm8796\" (UID: \"b4295f2b-c9b6-4604-b910-525c07cca2ed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.803866 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-ph47k" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.812336 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.911149 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.911728 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 
15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.916827 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-metrics-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.917210 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2279bdf-746c-4e8c-8703-74a256bd7923-webhook-certs\") pod \"openstack-operator-controller-manager-7d87bc949d-9mr4h\" (UID: \"f2279bdf-746c-4e8c-8703-74a256bd7923\") " pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.932074 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wsqnw" Feb 17 15:08:41 crc kubenswrapper[4717]: I0217 15:08:41.942053 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:42 crc kubenswrapper[4717]: I0217 15:08:42.237790 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h"] Feb 17 15:08:42 crc kubenswrapper[4717]: W0217 15:08:42.245557 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2279bdf_746c_4e8c_8703_74a256bd7923.slice/crio-cf03f7a4080f3245860d6b3097e1d5a90fbebe2ec1f750e3b7008cf0f37f6c6a WatchSource:0}: Error finding container cf03f7a4080f3245860d6b3097e1d5a90fbebe2ec1f750e3b7008cf0f37f6c6a: Status 404 returned error can't find the container with id cf03f7a4080f3245860d6b3097e1d5a90fbebe2ec1f750e3b7008cf0f37f6c6a Feb 17 15:08:42 crc kubenswrapper[4717]: I0217 15:08:42.330425 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796"] Feb 17 15:08:42 crc kubenswrapper[4717]: I0217 15:08:42.534533 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" event={"ID":"b4295f2b-c9b6-4604-b910-525c07cca2ed","Type":"ContainerStarted","Data":"5dc285e1c96629e65ad4e5657bd08ddb9890a3956d29208e274e80eac1e448f5"} Feb 17 15:08:42 crc kubenswrapper[4717]: I0217 15:08:42.536433 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" event={"ID":"f2279bdf-746c-4e8c-8703-74a256bd7923","Type":"ContainerStarted","Data":"4d96386f3cf8f044f7d2aa0e5b37e6988fc7d836fed6f80f17aaab2c911fdb63"} Feb 17 15:08:42 crc kubenswrapper[4717]: I0217 15:08:42.536511 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" 
event={"ID":"f2279bdf-746c-4e8c-8703-74a256bd7923","Type":"ContainerStarted","Data":"cf03f7a4080f3245860d6b3097e1d5a90fbebe2ec1f750e3b7008cf0f37f6c6a"} Feb 17 15:08:43 crc kubenswrapper[4717]: I0217 15:08:43.547330 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d" event={"ID":"dd3497e6-6de6-4bdf-a23e-16adc21de6ab","Type":"ContainerStarted","Data":"5ed5c994fae655cb1d8a9bf2373583e52d615623b4ba3ca0e4c6d8d08b76d749"} Feb 17 15:08:43 crc kubenswrapper[4717]: I0217 15:08:43.547923 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:43 crc kubenswrapper[4717]: I0217 15:08:43.574267 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2w74d" podStartSLOduration=3.324954106 podStartE2EDuration="34.574244859s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:11.294542013 +0000 UTC m=+957.710382499" lastFinishedPulling="2026-02-17 15:08:42.543832786 +0000 UTC m=+988.959673252" observedRunningTime="2026-02-17 15:08:43.569028479 +0000 UTC m=+989.984868965" watchObservedRunningTime="2026-02-17 15:08:43.574244859 +0000 UTC m=+989.990085325" Feb 17 15:08:43 crc kubenswrapper[4717]: I0217 15:08:43.585771 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" podStartSLOduration=34.585738808 podStartE2EDuration="34.585738808s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:08:42.569294876 +0000 UTC m=+988.985135372" watchObservedRunningTime="2026-02-17 15:08:43.585738808 +0000 UTC m=+990.001579284" Feb 17 15:08:45 crc 
kubenswrapper[4717]: I0217 15:08:45.266943 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-d6ssr" Feb 17 15:08:45 crc kubenswrapper[4717]: I0217 15:08:45.570113 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" event={"ID":"b4295f2b-c9b6-4604-b910-525c07cca2ed","Type":"ContainerStarted","Data":"cd20f24ddc8bd81f1ace568f2937a715da60f3511d8857607b13184035860fa6"} Feb 17 15:08:45 crc kubenswrapper[4717]: I0217 15:08:45.570289 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:45 crc kubenswrapper[4717]: I0217 15:08:45.603045 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" podStartSLOduration=33.915842951 podStartE2EDuration="36.603019437s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:42.344546928 +0000 UTC m=+988.760387414" lastFinishedPulling="2026-02-17 15:08:45.031723424 +0000 UTC m=+991.447563900" observedRunningTime="2026-02-17 15:08:45.601074391 +0000 UTC m=+992.016914887" watchObservedRunningTime="2026-02-17 15:08:45.603019437 +0000 UTC m=+992.018859913" Feb 17 15:08:46 crc kubenswrapper[4717]: I0217 15:08:46.581185 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" event={"ID":"22accf0b-8a4d-478a-bc45-d5bd4aa45b87","Type":"ContainerStarted","Data":"4d3d68844b9863bdcc2e34cb189823ceb963400eeb76fffc6dd52e6237b900e9"} Feb 17 15:08:46 crc kubenswrapper[4717]: I0217 15:08:46.581907 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" Feb 17 15:08:46 crc kubenswrapper[4717]: I0217 15:08:46.601548 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" podStartSLOduration=2.758469264 podStartE2EDuration="37.601527336s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.908268103 +0000 UTC m=+957.324108599" lastFinishedPulling="2026-02-17 15:08:45.751326195 +0000 UTC m=+992.167166671" observedRunningTime="2026-02-17 15:08:46.596615355 +0000 UTC m=+993.012455831" watchObservedRunningTime="2026-02-17 15:08:46.601527336 +0000 UTC m=+993.017367812" Feb 17 15:08:47 crc kubenswrapper[4717]: I0217 15:08:47.590749 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" event={"ID":"c12ec16a-c8fd-48ae-8c86-257bcef97050","Type":"ContainerStarted","Data":"d28afd2c6444cb8c4bf934c770e519b7fd1610963aee1bf6c2bc78a90c6baeba"} Feb 17 15:08:47 crc kubenswrapper[4717]: I0217 15:08:47.591408 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" Feb 17 15:08:47 crc kubenswrapper[4717]: I0217 15:08:47.618578 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" podStartSLOduration=2.596618131 podStartE2EDuration="38.618553516s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.683789505 +0000 UTC m=+957.099629981" lastFinishedPulling="2026-02-17 15:08:46.70572488 +0000 UTC m=+993.121565366" observedRunningTime="2026-02-17 15:08:47.613268234 +0000 UTC m=+994.029108720" watchObservedRunningTime="2026-02-17 15:08:47.618553516 +0000 UTC m=+994.034393992" Feb 17 15:08:49 crc kubenswrapper[4717]: I0217 
15:08:49.307675 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fp8nh" Feb 17 15:08:49 crc kubenswrapper[4717]: I0217 15:08:49.339711 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-5rhm9" Feb 17 15:08:49 crc kubenswrapper[4717]: I0217 15:08:49.385964 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-5ddqf" Feb 17 15:08:49 crc kubenswrapper[4717]: I0217 15:08:49.427256 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-qdg4j" Feb 17 15:08:49 crc kubenswrapper[4717]: I0217 15:08:49.603243 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-9kp5d" Feb 17 15:08:49 crc kubenswrapper[4717]: I0217 15:08:49.658455 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-s6xjq" Feb 17 15:08:49 crc kubenswrapper[4717]: I0217 15:08:49.745656 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-lcth2" Feb 17 15:08:49 crc kubenswrapper[4717]: I0217 15:08:49.866940 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" Feb 17 15:08:49 crc kubenswrapper[4717]: I0217 15:08:49.870767 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mspl4" Feb 17 15:08:49 crc kubenswrapper[4717]: I0217 15:08:49.913018 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dtxn5" Feb 17 15:08:49 crc kubenswrapper[4717]: I0217 15:08:49.970577 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-x6nmz" Feb 17 15:08:50 crc kubenswrapper[4717]: I0217 15:08:50.015485 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-j9tsj" Feb 17 15:08:50 crc kubenswrapper[4717]: I0217 15:08:50.059601 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hzzpc" Feb 17 15:08:50 crc kubenswrapper[4717]: I0217 15:08:50.099851 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-22tfq" Feb 17 15:08:50 crc kubenswrapper[4717]: I0217 15:08:50.106104 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-spbft" Feb 17 15:08:50 crc kubenswrapper[4717]: I0217 15:08:50.136941 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-zlkwq" Feb 17 15:08:50 crc kubenswrapper[4717]: I0217 15:08:50.342880 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-jq8zs" Feb 17 15:08:50 crc kubenswrapper[4717]: I0217 15:08:50.808216 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:08:50 crc 
kubenswrapper[4717]: I0217 15:08:50.808638 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:08:51 crc kubenswrapper[4717]: I0217 15:08:51.822330 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cm8796" Feb 17 15:08:51 crc kubenswrapper[4717]: I0217 15:08:51.950300 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7d87bc949d-9mr4h" Feb 17 15:08:59 crc kubenswrapper[4717]: I0217 15:08:59.540794 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-5d768" Feb 17 15:08:59 crc kubenswrapper[4717]: I0217 15:08:59.778387 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gmfn2" Feb 17 15:09:02 crc kubenswrapper[4717]: I0217 15:09:02.706883 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" event={"ID":"6edcc364-656f-4f2d-aa9a-3409b3b58471","Type":"ContainerStarted","Data":"0e929db0d4c578698bdabe65f63c6e9e1d94e6fb439ac6a1e482774f137b531c"} Feb 17 15:09:02 crc kubenswrapper[4717]: I0217 15:09:02.708110 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" Feb 17 15:09:02 crc kubenswrapper[4717]: I0217 15:09:02.726224 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" podStartSLOduration=2.481627763 podStartE2EDuration="53.726196227s" podCreationTimestamp="2026-02-17 15:08:09 +0000 UTC" firstStartedPulling="2026-02-17 15:08:10.918742001 +0000 UTC m=+957.334582477" lastFinishedPulling="2026-02-17 15:09:02.163310465 +0000 UTC m=+1008.579150941" observedRunningTime="2026-02-17 15:09:02.72210444 +0000 UTC m=+1009.137944936" watchObservedRunningTime="2026-02-17 15:09:02.726196227 +0000 UTC m=+1009.142036703" Feb 17 15:09:09 crc kubenswrapper[4717]: I0217 15:09:09.838012 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-d7sx6" Feb 17 15:09:20 crc kubenswrapper[4717]: I0217 15:09:20.808488 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:09:20 crc kubenswrapper[4717]: I0217 15:09:20.809421 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:09:20 crc kubenswrapper[4717]: I0217 15:09:20.877278 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-78855"] Feb 17 15:09:20 crc kubenswrapper[4717]: E0217 15:09:20.878559 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106d09b9-99cf-4500-88db-c6e8121491b6" containerName="extract-content" Feb 17 15:09:20 crc kubenswrapper[4717]: I0217 15:09:20.878637 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="106d09b9-99cf-4500-88db-c6e8121491b6" containerName="extract-content" Feb 17 15:09:20 crc kubenswrapper[4717]: E0217 15:09:20.878708 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106d09b9-99cf-4500-88db-c6e8121491b6" containerName="registry-server" Feb 17 15:09:20 crc kubenswrapper[4717]: I0217 15:09:20.878724 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="106d09b9-99cf-4500-88db-c6e8121491b6" containerName="registry-server" Feb 17 15:09:20 crc kubenswrapper[4717]: E0217 15:09:20.878751 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106d09b9-99cf-4500-88db-c6e8121491b6" containerName="extract-utilities" Feb 17 15:09:20 crc kubenswrapper[4717]: I0217 15:09:20.878800 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="106d09b9-99cf-4500-88db-c6e8121491b6" containerName="extract-utilities" Feb 17 15:09:20 crc kubenswrapper[4717]: I0217 15:09:20.879315 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="106d09b9-99cf-4500-88db-c6e8121491b6" containerName="registry-server" Feb 17 15:09:20 crc kubenswrapper[4717]: I0217 15:09:20.889202 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:20 crc kubenswrapper[4717]: I0217 15:09:20.915197 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-78855"] Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.014239 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzsq4\" (UniqueName: \"kubernetes.io/projected/cef2f13d-d544-4b06-82df-7c485a318037-kube-api-access-nzsq4\") pod \"certified-operators-78855\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.014322 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-catalog-content\") pod \"certified-operators-78855\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.014394 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-utilities\") pod \"certified-operators-78855\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.115591 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzsq4\" (UniqueName: \"kubernetes.io/projected/cef2f13d-d544-4b06-82df-7c485a318037-kube-api-access-nzsq4\") pod \"certified-operators-78855\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.116142 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-catalog-content\") pod \"certified-operators-78855\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.116178 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-utilities\") pod \"certified-operators-78855\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.116960 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-catalog-content\") pod \"certified-operators-78855\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.116970 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-utilities\") pod \"certified-operators-78855\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.143816 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzsq4\" (UniqueName: \"kubernetes.io/projected/cef2f13d-d544-4b06-82df-7c485a318037-kube-api-access-nzsq4\") pod \"certified-operators-78855\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.256220 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.760312 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-78855"] Feb 17 15:09:21 crc kubenswrapper[4717]: I0217 15:09:21.894371 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78855" event={"ID":"cef2f13d-d544-4b06-82df-7c485a318037","Type":"ContainerStarted","Data":"6abb5242b39dfa024b98cf56d432e3767b0f84f874651c9739887bf33c2af39c"} Feb 17 15:09:22 crc kubenswrapper[4717]: I0217 15:09:22.903454 4717 generic.go:334] "Generic (PLEG): container finished" podID="cef2f13d-d544-4b06-82df-7c485a318037" containerID="3d77d2b1625f12cc883922b78072847bd772f3918f9612764f90656bec71082b" exitCode=0 Feb 17 15:09:22 crc kubenswrapper[4717]: I0217 15:09:22.903717 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78855" event={"ID":"cef2f13d-d544-4b06-82df-7c485a318037","Type":"ContainerDied","Data":"3d77d2b1625f12cc883922b78072847bd772f3918f9612764f90656bec71082b"} Feb 17 15:09:22 crc kubenswrapper[4717]: I0217 15:09:22.905752 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:09:23 crc kubenswrapper[4717]: I0217 15:09:23.913981 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78855" event={"ID":"cef2f13d-d544-4b06-82df-7c485a318037","Type":"ContainerStarted","Data":"361f4b86fb4b7008224e7c01e46e332e24b73a6c0b4fe530d41f2b094ee82d3a"} Feb 17 15:09:24 crc kubenswrapper[4717]: I0217 15:09:24.924352 4717 generic.go:334] "Generic (PLEG): container finished" podID="cef2f13d-d544-4b06-82df-7c485a318037" containerID="361f4b86fb4b7008224e7c01e46e332e24b73a6c0b4fe530d41f2b094ee82d3a" exitCode=0 Feb 17 15:09:24 crc kubenswrapper[4717]: I0217 15:09:24.924412 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-78855" event={"ID":"cef2f13d-d544-4b06-82df-7c485a318037","Type":"ContainerDied","Data":"361f4b86fb4b7008224e7c01e46e332e24b73a6c0b4fe530d41f2b094ee82d3a"} Feb 17 15:09:25 crc kubenswrapper[4717]: I0217 15:09:25.933840 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78855" event={"ID":"cef2f13d-d544-4b06-82df-7c485a318037","Type":"ContainerStarted","Data":"05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a"} Feb 17 15:09:25 crc kubenswrapper[4717]: I0217 15:09:25.962508 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-78855" podStartSLOduration=3.549008585 podStartE2EDuration="5.962486702s" podCreationTimestamp="2026-02-17 15:09:20 +0000 UTC" firstStartedPulling="2026-02-17 15:09:22.905470814 +0000 UTC m=+1029.321311300" lastFinishedPulling="2026-02-17 15:09:25.318948931 +0000 UTC m=+1031.734789417" observedRunningTime="2026-02-17 15:09:25.956683956 +0000 UTC m=+1032.372524472" watchObservedRunningTime="2026-02-17 15:09:25.962486702 +0000 UTC m=+1032.378327188" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.655548 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bczjv"] Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.657068 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.659585 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.659795 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.659754 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.660224 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-nst7t" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.671383 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bczjv"] Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.729793 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj9rk\" (UniqueName: \"kubernetes.io/projected/d5dd2394-2d20-49b9-bd73-377d1376aa73-kube-api-access-pj9rk\") pod \"dnsmasq-dns-675f4bcbfc-bczjv\" (UID: \"d5dd2394-2d20-49b9-bd73-377d1376aa73\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.729856 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dd2394-2d20-49b9-bd73-377d1376aa73-config\") pod \"dnsmasq-dns-675f4bcbfc-bczjv\" (UID: \"d5dd2394-2d20-49b9-bd73-377d1376aa73\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.735514 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s546s"] Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.736738 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.739538 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.753409 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s546s"] Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.830931 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-config\") pod \"dnsmasq-dns-78dd6ddcc-s546s\" (UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.831001 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5kth\" (UniqueName: \"kubernetes.io/projected/39743706-11cd-45b4-ac36-2f4db1e61723-kube-api-access-f5kth\") pod \"dnsmasq-dns-78dd6ddcc-s546s\" (UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.831175 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-s546s\" (UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.831575 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj9rk\" (UniqueName: \"kubernetes.io/projected/d5dd2394-2d20-49b9-bd73-377d1376aa73-kube-api-access-pj9rk\") pod \"dnsmasq-dns-675f4bcbfc-bczjv\" (UID: \"d5dd2394-2d20-49b9-bd73-377d1376aa73\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" Feb 17 
15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.831819 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dd2394-2d20-49b9-bd73-377d1376aa73-config\") pod \"dnsmasq-dns-675f4bcbfc-bczjv\" (UID: \"d5dd2394-2d20-49b9-bd73-377d1376aa73\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.833034 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dd2394-2d20-49b9-bd73-377d1376aa73-config\") pod \"dnsmasq-dns-675f4bcbfc-bczjv\" (UID: \"d5dd2394-2d20-49b9-bd73-377d1376aa73\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.860440 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj9rk\" (UniqueName: \"kubernetes.io/projected/d5dd2394-2d20-49b9-bd73-377d1376aa73-kube-api-access-pj9rk\") pod \"dnsmasq-dns-675f4bcbfc-bczjv\" (UID: \"d5dd2394-2d20-49b9-bd73-377d1376aa73\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.932905 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-config\") pod \"dnsmasq-dns-78dd6ddcc-s546s\" (UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.932953 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5kth\" (UniqueName: \"kubernetes.io/projected/39743706-11cd-45b4-ac36-2f4db1e61723-kube-api-access-f5kth\") pod \"dnsmasq-dns-78dd6ddcc-s546s\" (UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.932981 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-s546s\" (UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.933769 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-config\") pod \"dnsmasq-dns-78dd6ddcc-s546s\" (UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.933861 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-s546s\" (UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.953959 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5kth\" (UniqueName: \"kubernetes.io/projected/39743706-11cd-45b4-ac36-2f4db1e61723-kube-api-access-f5kth\") pod \"dnsmasq-dns-78dd6ddcc-s546s\" (UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:26 crc kubenswrapper[4717]: I0217 15:09:26.977249 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" Feb 17 15:09:27 crc kubenswrapper[4717]: I0217 15:09:27.054485 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:27 crc kubenswrapper[4717]: I0217 15:09:27.477750 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bczjv"] Feb 17 15:09:27 crc kubenswrapper[4717]: W0217 15:09:27.489137 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5dd2394_2d20_49b9_bd73_377d1376aa73.slice/crio-f0cc6f0bd4c07271510a1f84d5fbdae66be18b2171f753e02b82852344daf066 WatchSource:0}: Error finding container f0cc6f0bd4c07271510a1f84d5fbdae66be18b2171f753e02b82852344daf066: Status 404 returned error can't find the container with id f0cc6f0bd4c07271510a1f84d5fbdae66be18b2171f753e02b82852344daf066 Feb 17 15:09:27 crc kubenswrapper[4717]: I0217 15:09:27.542102 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s546s"] Feb 17 15:09:27 crc kubenswrapper[4717]: W0217 15:09:27.547836 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39743706_11cd_45b4_ac36_2f4db1e61723.slice/crio-0e053d85a29119ebc1ad090a601395a503176f3f9f9261fe6bdfe99a6b40f773 WatchSource:0}: Error finding container 0e053d85a29119ebc1ad090a601395a503176f3f9f9261fe6bdfe99a6b40f773: Status 404 returned error can't find the container with id 0e053d85a29119ebc1ad090a601395a503176f3f9f9261fe6bdfe99a6b40f773 Feb 17 15:09:27 crc kubenswrapper[4717]: I0217 15:09:27.953468 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" event={"ID":"d5dd2394-2d20-49b9-bd73-377d1376aa73","Type":"ContainerStarted","Data":"f0cc6f0bd4c07271510a1f84d5fbdae66be18b2171f753e02b82852344daf066"} Feb 17 15:09:27 crc kubenswrapper[4717]: I0217 15:09:27.955031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" 
event={"ID":"39743706-11cd-45b4-ac36-2f4db1e61723","Type":"ContainerStarted","Data":"0e053d85a29119ebc1ad090a601395a503176f3f9f9261fe6bdfe99a6b40f773"} Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.346971 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bczjv"] Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.375854 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-crgsg"] Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.377233 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-crgsg" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.395887 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-crgsg"] Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.475121 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-config\") pod \"dnsmasq-dns-666b6646f7-crgsg\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") " pod="openstack/dnsmasq-dns-666b6646f7-crgsg" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.475213 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-dns-svc\") pod \"dnsmasq-dns-666b6646f7-crgsg\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") " pod="openstack/dnsmasq-dns-666b6646f7-crgsg" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.475303 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btdpj\" (UniqueName: \"kubernetes.io/projected/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-kube-api-access-btdpj\") pod \"dnsmasq-dns-666b6646f7-crgsg\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") " 
pod="openstack/dnsmasq-dns-666b6646f7-crgsg" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.577305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btdpj\" (UniqueName: \"kubernetes.io/projected/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-kube-api-access-btdpj\") pod \"dnsmasq-dns-666b6646f7-crgsg\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") " pod="openstack/dnsmasq-dns-666b6646f7-crgsg" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.577357 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-config\") pod \"dnsmasq-dns-666b6646f7-crgsg\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") " pod="openstack/dnsmasq-dns-666b6646f7-crgsg" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.577416 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-dns-svc\") pod \"dnsmasq-dns-666b6646f7-crgsg\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") " pod="openstack/dnsmasq-dns-666b6646f7-crgsg" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.578300 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-dns-svc\") pod \"dnsmasq-dns-666b6646f7-crgsg\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") " pod="openstack/dnsmasq-dns-666b6646f7-crgsg" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.578956 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-config\") pod \"dnsmasq-dns-666b6646f7-crgsg\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") " pod="openstack/dnsmasq-dns-666b6646f7-crgsg" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.630613 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btdpj\" (UniqueName: \"kubernetes.io/projected/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-kube-api-access-btdpj\") pod \"dnsmasq-dns-666b6646f7-crgsg\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") " pod="openstack/dnsmasq-dns-666b6646f7-crgsg" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.650452 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s546s"] Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.682888 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xhwc2"] Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.684127 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.702867 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-crgsg" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.711616 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xhwc2"] Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.789615 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xhwc2\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") " pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.790166 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6nn\" (UniqueName: \"kubernetes.io/projected/7d89b253-1e16-4f06-a413-db01a9af4574-kube-api-access-2k6nn\") pod \"dnsmasq-dns-57d769cc4f-xhwc2\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") " pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" 
Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.790276 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-config\") pod \"dnsmasq-dns-57d769cc4f-xhwc2\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") " pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.893861 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-config\") pod \"dnsmasq-dns-57d769cc4f-xhwc2\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") " pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.894808 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xhwc2\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") " pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.894872 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6nn\" (UniqueName: \"kubernetes.io/projected/7d89b253-1e16-4f06-a413-db01a9af4574-kube-api-access-2k6nn\") pod \"dnsmasq-dns-57d769cc4f-xhwc2\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") " pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.894755 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-config\") pod \"dnsmasq-dns-57d769cc4f-xhwc2\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") " pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.895511 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xhwc2\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") " pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" Feb 17 15:09:29 crc kubenswrapper[4717]: I0217 15:09:29.927765 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6nn\" (UniqueName: \"kubernetes.io/projected/7d89b253-1e16-4f06-a413-db01a9af4574-kube-api-access-2k6nn\") pod \"dnsmasq-dns-57d769cc4f-xhwc2\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") " pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.010997 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.402673 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-crgsg"] Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.506363 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xhwc2"] Feb 17 15:09:30 crc kubenswrapper[4717]: W0217 15:09:30.520681 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d89b253_1e16_4f06_a413_db01a9af4574.slice/crio-6fa49c0a2216287a85f505754556409787b3ddb7ed345ababd475ddcdd5ec848 WatchSource:0}: Error finding container 6fa49c0a2216287a85f505754556409787b3ddb7ed345ababd475ddcdd5ec848: Status 404 returned error can't find the container with id 6fa49c0a2216287a85f505754556409787b3ddb7ed345ababd475ddcdd5ec848 Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.527399 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.528815 4717 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.536908 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.537338 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.537617 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.537731 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8gt4v" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.537770 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.537779 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.538251 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.560850 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.609248 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.609305 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.609337 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8924cebf-3c79-4978-9564-ec8869b9d79a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.609493 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8924cebf-3c79-4978-9564-ec8869b9d79a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.609546 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.609676 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-config-data\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.609809 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqxbk\" (UniqueName: 
\"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-kube-api-access-mqxbk\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.609888 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.609929 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.610025 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.610123 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.712161 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8924cebf-3c79-4978-9564-ec8869b9d79a-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.712232 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.712269 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-config-data\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.712309 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqxbk\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-kube-api-access-mqxbk\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.712347 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.712373 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: 
I0217 15:09:30.712413 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.712451 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.712472 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.712493 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.712510 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8924cebf-3c79-4978-9564-ec8869b9d79a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.713593 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.713994 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.714304 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.714969 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-config-data\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.716873 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.719437 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 
15:09:30.726819 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8924cebf-3c79-4978-9564-ec8869b9d79a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.729630 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.739939 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.740368 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8924cebf-3c79-4978-9564-ec8869b9d79a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.741702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.744262 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqxbk\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-kube-api-access-mqxbk\") pod 
\"rabbitmq-server-0\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.832712 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.834860 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.839892 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.840190 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.840436 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.840527 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.840669 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.840783 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-kwnlh" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.841718 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.845398 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.896591 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.915423 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0eb38f44-bed1-4e65-8de2-9624715baee1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.915469 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.915488 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.915524 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.915567 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8jr7\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-kube-api-access-m8jr7\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.915587 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.915602 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.915636 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.915698 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.915735 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.915753 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0eb38f44-bed1-4e65-8de2-9624715baee1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.994543 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-crgsg" event={"ID":"fb0eb1e2-46e0-4783-bdb4-920c8501a97a","Type":"ContainerStarted","Data":"0519f37e952d1de6fc06c96a0b14adeca5cc30b6994e398245e352c27d687092"} Feb 17 15:09:30 crc kubenswrapper[4717]: I0217 15:09:30.996852 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" event={"ID":"7d89b253-1e16-4f06-a413-db01a9af4574","Type":"ContainerStarted","Data":"6fa49c0a2216287a85f505754556409787b3ddb7ed345ababd475ddcdd5ec848"} Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.020227 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.020594 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.020632 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.020651 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0eb38f44-bed1-4e65-8de2-9624715baee1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.020693 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0eb38f44-bed1-4e65-8de2-9624715baee1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.020720 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.020737 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.020773 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.020816 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8jr7\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-kube-api-access-m8jr7\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.020845 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.020863 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.023937 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.024285 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0eb38f44-bed1-4e65-8de2-9624715baee1\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.025392 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.025660 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.026664 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.027197 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.030462 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc 
kubenswrapper[4717]: I0217 15:09:31.041358 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0eb38f44-bed1-4e65-8de2-9624715baee1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.045893 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8jr7\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-kube-api-access-m8jr7\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.051243 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.053340 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0eb38f44-bed1-4e65-8de2-9624715baee1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.095877 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.257000 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.257060 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.266109 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.328015 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.495712 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 15:09:31 crc kubenswrapper[4717]: W0217 15:09:31.516550 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8924cebf_3c79_4978_9564_ec8869b9d79a.slice/crio-2ce4d391f32e4899023fcfbc3cf6e3c054c02f8462cdda8e71fa0e7a1c96ad89 WatchSource:0}: Error finding container 2ce4d391f32e4899023fcfbc3cf6e3c054c02f8462cdda8e71fa0e7a1c96ad89: Status 404 returned error can't find the container with id 2ce4d391f32e4899023fcfbc3cf6e3c054c02f8462cdda8e71fa0e7a1c96ad89 Feb 17 15:09:31 crc kubenswrapper[4717]: I0217 15:09:31.752499 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 15:09:31 crc kubenswrapper[4717]: W0217 15:09:31.780337 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eb38f44_bed1_4e65_8de2_9624715baee1.slice/crio-5a4ec674f10416720d1b003586ad4b91e335572a2782a4c2ced47e3bae754bca WatchSource:0}: Error finding container 5a4ec674f10416720d1b003586ad4b91e335572a2782a4c2ced47e3bae754bca: Status 404 returned error can't find the container with id 
5a4ec674f10416720d1b003586ad4b91e335572a2782a4c2ced47e3bae754bca Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.008037 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0eb38f44-bed1-4e65-8de2-9624715baee1","Type":"ContainerStarted","Data":"5a4ec674f10416720d1b003586ad4b91e335572a2782a4c2ced47e3bae754bca"} Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.011093 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8924cebf-3c79-4978-9564-ec8869b9d79a","Type":"ContainerStarted","Data":"2ce4d391f32e4899023fcfbc3cf6e3c054c02f8462cdda8e71fa0e7a1c96ad89"} Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.060611 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.062861 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.066213 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-tmbcn" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.066496 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.068101 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.068265 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.068958 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.076671 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"combined-ca-bundle" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.114985 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.158367 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ec72b87-16f5-487e-ae08-a52b5d289bee-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.158476 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ec72b87-16f5-487e-ae08-a52b5d289bee-kolla-config\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.159119 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8ec72b87-16f5-487e-ae08-a52b5d289bee-config-data-default\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.159361 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec72b87-16f5-487e-ae08-a52b5d289bee-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.159416 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bhbd\" (UniqueName: 
\"kubernetes.io/projected/8ec72b87-16f5-487e-ae08-a52b5d289bee-kube-api-access-2bhbd\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.159506 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec72b87-16f5-487e-ae08-a52b5d289bee-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.159636 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8ec72b87-16f5-487e-ae08-a52b5d289bee-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.159729 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.217667 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-78855"] Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.261084 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8ec72b87-16f5-487e-ae08-a52b5d289bee-config-data-default\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.261175 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec72b87-16f5-487e-ae08-a52b5d289bee-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.261201 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bhbd\" (UniqueName: \"kubernetes.io/projected/8ec72b87-16f5-487e-ae08-a52b5d289bee-kube-api-access-2bhbd\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.261225 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec72b87-16f5-487e-ae08-a52b5d289bee-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.261257 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8ec72b87-16f5-487e-ae08-a52b5d289bee-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.261299 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.261326 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ec72b87-16f5-487e-ae08-a52b5d289bee-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.261347 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ec72b87-16f5-487e-ae08-a52b5d289bee-kolla-config\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.262743 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8ec72b87-16f5-487e-ae08-a52b5d289bee-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.263102 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ec72b87-16f5-487e-ae08-a52b5d289bee-kolla-config\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.263229 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.263542 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ec72b87-16f5-487e-ae08-a52b5d289bee-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.264931 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8ec72b87-16f5-487e-ae08-a52b5d289bee-config-data-default\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.273372 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec72b87-16f5-487e-ae08-a52b5d289bee-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.277686 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ec72b87-16f5-487e-ae08-a52b5d289bee-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.305070 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bhbd\" (UniqueName: \"kubernetes.io/projected/8ec72b87-16f5-487e-ae08-a52b5d289bee-kube-api-access-2bhbd\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.340157 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8ec72b87-16f5-487e-ae08-a52b5d289bee\") " pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.450907 4717 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 15:09:32 crc kubenswrapper[4717]: I0217 15:09:32.957908 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.086342 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8ec72b87-16f5-487e-ae08-a52b5d289bee","Type":"ContainerStarted","Data":"dc60332ca5e5fda1b25183185af91e421d75cf27564547eefee9d958bd7753cd"} Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.433789 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.444524 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.444647 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.450191 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-k7s2v" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.450388 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.450472 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.450576 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.494510 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/14b7d39d-8183-4e96-a163-b72323ccb0b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.494556 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.494580 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b7d39d-8183-4e96-a163-b72323ccb0b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.494598 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/14b7d39d-8183-4e96-a163-b72323ccb0b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.494637 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14b7d39d-8183-4e96-a163-b72323ccb0b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.494668 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/14b7d39d-8183-4e96-a163-b72323ccb0b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.494690 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddqxl\" (UniqueName: \"kubernetes.io/projected/14b7d39d-8183-4e96-a163-b72323ccb0b5-kube-api-access-ddqxl\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.494725 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/14b7d39d-8183-4e96-a163-b72323ccb0b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.518193 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.519538 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.524772 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.525010 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-6zq7h" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.525054 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.531253 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597155 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-config-data\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597217 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rct5\" (UniqueName: \"kubernetes.io/projected/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-kube-api-access-4rct5\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597260 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b7d39d-8183-4e96-a163-b72323ccb0b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597459 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597611 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/14b7d39d-8183-4e96-a163-b72323ccb0b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597678 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b7d39d-8183-4e96-a163-b72323ccb0b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597699 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597732 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-kolla-config\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597765 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14b7d39d-8183-4e96-a163-b72323ccb0b5-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597769 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597822 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/14b7d39d-8183-4e96-a163-b72323ccb0b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.597842 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddqxl\" (UniqueName: \"kubernetes.io/projected/14b7d39d-8183-4e96-a163-b72323ccb0b5-kube-api-access-ddqxl\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.598038 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/14b7d39d-8183-4e96-a163-b72323ccb0b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.598074 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-combined-ca-bundle\") pod \"memcached-0\" 
(UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.598551 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/14b7d39d-8183-4e96-a163-b72323ccb0b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.599414 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/14b7d39d-8183-4e96-a163-b72323ccb0b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.599803 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/14b7d39d-8183-4e96-a163-b72323ccb0b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.600334 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14b7d39d-8183-4e96-a163-b72323ccb0b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.616056 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b7d39d-8183-4e96-a163-b72323ccb0b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc 
kubenswrapper[4717]: I0217 15:09:33.618780 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddqxl\" (UniqueName: \"kubernetes.io/projected/14b7d39d-8183-4e96-a163-b72323ccb0b5-kube-api-access-ddqxl\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.619950 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.634175 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b7d39d-8183-4e96-a163-b72323ccb0b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"14b7d39d-8183-4e96-a163-b72323ccb0b5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.699842 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-kolla-config\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.699972 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.700023 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-config-data\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.700039 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rct5\" (UniqueName: \"kubernetes.io/projected/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-kube-api-access-4rct5\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.700073 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.700605 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-kolla-config\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.701270 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-config-data\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.705445 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 
15:09:33.708034 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.717567 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rct5\" (UniqueName: \"kubernetes.io/projected/0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f-kube-api-access-4rct5\") pod \"memcached-0\" (UID: \"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f\") " pod="openstack/memcached-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.786770 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 15:09:33 crc kubenswrapper[4717]: I0217 15:09:33.842099 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 15:09:34 crc kubenswrapper[4717]: I0217 15:09:34.097471 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-78855" podUID="cef2f13d-d544-4b06-82df-7c485a318037" containerName="registry-server" containerID="cri-o://05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a" gracePeriod=2 Feb 17 15:09:36 crc kubenswrapper[4717]: I0217 15:09:36.072900 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 15:09:36 crc kubenswrapper[4717]: I0217 15:09:36.081932 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 15:09:36 crc kubenswrapper[4717]: I0217 15:09:36.087841 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-p6nln" Feb 17 15:09:36 crc kubenswrapper[4717]: I0217 15:09:36.096181 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 15:09:36 crc kubenswrapper[4717]: I0217 15:09:36.146553 4717 generic.go:334] "Generic (PLEG): container finished" podID="cef2f13d-d544-4b06-82df-7c485a318037" containerID="05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a" exitCode=0 Feb 17 15:09:36 crc kubenswrapper[4717]: I0217 15:09:36.146607 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78855" event={"ID":"cef2f13d-d544-4b06-82df-7c485a318037","Type":"ContainerDied","Data":"05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a"} Feb 17 15:09:36 crc kubenswrapper[4717]: I0217 15:09:36.149884 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8j2j\" (UniqueName: \"kubernetes.io/projected/e905bd78-7554-4636-b508-a2a67078018e-kube-api-access-r8j2j\") pod \"kube-state-metrics-0\" (UID: \"e905bd78-7554-4636-b508-a2a67078018e\") " pod="openstack/kube-state-metrics-0" Feb 17 15:09:36 crc kubenswrapper[4717]: I0217 15:09:36.252497 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8j2j\" (UniqueName: \"kubernetes.io/projected/e905bd78-7554-4636-b508-a2a67078018e-kube-api-access-r8j2j\") pod \"kube-state-metrics-0\" (UID: \"e905bd78-7554-4636-b508-a2a67078018e\") " pod="openstack/kube-state-metrics-0" Feb 17 15:09:36 crc kubenswrapper[4717]: I0217 15:09:36.274205 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8j2j\" (UniqueName: 
\"kubernetes.io/projected/e905bd78-7554-4636-b508-a2a67078018e-kube-api-access-r8j2j\") pod \"kube-state-metrics-0\" (UID: \"e905bd78-7554-4636-b508-a2a67078018e\") " pod="openstack/kube-state-metrics-0" Feb 17 15:09:36 crc kubenswrapper[4717]: I0217 15:09:36.401950 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.454106 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.456344 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.461557 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.462010 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kkslt" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.464353 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.464584 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.464747 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.490049 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.536511 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.536599 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a71caa4-7d53-466e-8d74-98c814d3afda-config\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.536666 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a71caa4-7d53-466e-8d74-98c814d3afda-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.536710 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a71caa4-7d53-466e-8d74-98c814d3afda-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.536766 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a71caa4-7d53-466e-8d74-98c814d3afda-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.536798 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a71caa4-7d53-466e-8d74-98c814d3afda-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 
crc kubenswrapper[4717]: I0217 15:09:39.536835 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a71caa4-7d53-466e-8d74-98c814d3afda-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.536876 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mc47\" (UniqueName: \"kubernetes.io/projected/7a71caa4-7d53-466e-8d74-98c814d3afda-kube-api-access-4mc47\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.638433 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a71caa4-7d53-466e-8d74-98c814d3afda-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.638869 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a71caa4-7d53-466e-8d74-98c814d3afda-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.640145 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mc47\" (UniqueName: \"kubernetes.io/projected/7a71caa4-7d53-466e-8d74-98c814d3afda-kube-api-access-4mc47\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.640339 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.640432 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a71caa4-7d53-466e-8d74-98c814d3afda-config\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.639110 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a71caa4-7d53-466e-8d74-98c814d3afda-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.640711 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a71caa4-7d53-466e-8d74-98c814d3afda-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.640821 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a71caa4-7d53-466e-8d74-98c814d3afda-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.640922 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a71caa4-7d53-466e-8d74-98c814d3afda-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.640935 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.643492 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a71caa4-7d53-466e-8d74-98c814d3afda-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.644000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a71caa4-7d53-466e-8d74-98c814d3afda-config\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.652875 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a71caa4-7d53-466e-8d74-98c814d3afda-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.660798 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a71caa4-7d53-466e-8d74-98c814d3afda-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.665846 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a71caa4-7d53-466e-8d74-98c814d3afda-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.684064 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mc47\" (UniqueName: \"kubernetes.io/projected/7a71caa4-7d53-466e-8d74-98c814d3afda-kube-api-access-4mc47\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.699735 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7a71caa4-7d53-466e-8d74-98c814d3afda\") " pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.782066 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.881291 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-k4zc7"] Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.882605 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.884662 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-24zbj" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.885763 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.885986 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.891765 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5r2q4"] Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.893673 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.903171 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k4zc7"] Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.913495 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5r2q4"] Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.946496 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/edd6eb53-55b7-4a61-867c-e4bf277af963-var-run-ovn\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.946561 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mfkf\" (UniqueName: \"kubernetes.io/projected/edd6eb53-55b7-4a61-867c-e4bf277af963-kube-api-access-7mfkf\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " 
pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.946596 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l74c9\" (UniqueName: \"kubernetes.io/projected/1954ed93-1aa6-4c08-8379-01d047f5da20-kube-api-access-l74c9\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.946727 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd6eb53-55b7-4a61-867c-e4bf277af963-scripts\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.946839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-var-run\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.947243 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-etc-ovs\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.947658 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1954ed93-1aa6-4c08-8379-01d047f5da20-scripts\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 
15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.947701 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-var-lib\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.947748 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edd6eb53-55b7-4a61-867c-e4bf277af963-var-run\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.947794 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd6eb53-55b7-4a61-867c-e4bf277af963-combined-ca-bundle\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.947841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/edd6eb53-55b7-4a61-867c-e4bf277af963-var-log-ovn\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.947908 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-var-log\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:39 crc kubenswrapper[4717]: I0217 15:09:39.947976 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/edd6eb53-55b7-4a61-867c-e4bf277af963-ovn-controller-tls-certs\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.049948 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-var-run\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050025 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-etc-ovs\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050052 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1954ed93-1aa6-4c08-8379-01d047f5da20-scripts\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050101 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-var-lib\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050126 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/edd6eb53-55b7-4a61-867c-e4bf277af963-var-run\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050193 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd6eb53-55b7-4a61-867c-e4bf277af963-combined-ca-bundle\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050222 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/edd6eb53-55b7-4a61-867c-e4bf277af963-var-log-ovn\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050255 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-var-log\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050286 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/edd6eb53-55b7-4a61-867c-e4bf277af963-ovn-controller-tls-certs\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050319 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/edd6eb53-55b7-4a61-867c-e4bf277af963-var-run-ovn\") pod \"ovn-controller-k4zc7\" (UID: 
\"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050343 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mfkf\" (UniqueName: \"kubernetes.io/projected/edd6eb53-55b7-4a61-867c-e4bf277af963-kube-api-access-7mfkf\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050367 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l74c9\" (UniqueName: \"kubernetes.io/projected/1954ed93-1aa6-4c08-8379-01d047f5da20-kube-api-access-l74c9\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050388 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd6eb53-55b7-4a61-867c-e4bf277af963-scripts\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050737 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-etc-ovs\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050888 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edd6eb53-55b7-4a61-867c-e4bf277af963-var-run\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.050902 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-var-run\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.051042 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/edd6eb53-55b7-4a61-867c-e4bf277af963-var-run-ovn\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.051069 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-var-log\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.051757 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1954ed93-1aa6-4c08-8379-01d047f5da20-var-lib\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.051858 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/edd6eb53-55b7-4a61-867c-e4bf277af963-var-log-ovn\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.052193 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1954ed93-1aa6-4c08-8379-01d047f5da20-scripts\") pod \"ovn-controller-ovs-5r2q4\" 
(UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.052518 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd6eb53-55b7-4a61-867c-e4bf277af963-scripts\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.055139 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/edd6eb53-55b7-4a61-867c-e4bf277af963-ovn-controller-tls-certs\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.057372 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd6eb53-55b7-4a61-867c-e4bf277af963-combined-ca-bundle\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.069349 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l74c9\" (UniqueName: \"kubernetes.io/projected/1954ed93-1aa6-4c08-8379-01d047f5da20-kube-api-access-l74c9\") pod \"ovn-controller-ovs-5r2q4\" (UID: \"1954ed93-1aa6-4c08-8379-01d047f5da20\") " pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.070889 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mfkf\" (UniqueName: \"kubernetes.io/projected/edd6eb53-55b7-4a61-867c-e4bf277af963-kube-api-access-7mfkf\") pod \"ovn-controller-k4zc7\" (UID: \"edd6eb53-55b7-4a61-867c-e4bf277af963\") " pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc 
kubenswrapper[4717]: I0217 15:09:40.204748 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k4zc7" Feb 17 15:09:40 crc kubenswrapper[4717]: I0217 15:09:40.213145 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:09:41 crc kubenswrapper[4717]: E0217 15:09:41.257866 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a is running failed: container process not found" containerID="05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 15:09:41 crc kubenswrapper[4717]: E0217 15:09:41.262230 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a is running failed: container process not found" containerID="05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 15:09:41 crc kubenswrapper[4717]: E0217 15:09:41.265765 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a is running failed: container process not found" containerID="05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 15:09:41 crc kubenswrapper[4717]: E0217 15:09:41.265815 4717 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a is running failed: container process not found" 
probeType="Readiness" pod="openshift-marketplace/certified-operators-78855" podUID="cef2f13d-d544-4b06-82df-7c485a318037" containerName="registry-server" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.100701 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.103290 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.106716 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.106977 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5g2hl" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.107116 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.107171 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.114493 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.208635 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.208702 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2c8b\" (UniqueName: \"kubernetes.io/projected/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-kube-api-access-t2c8b\") pod 
\"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.208738 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-config\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.208761 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.208984 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.209367 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.209447 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " 
pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.209821 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.312223 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.312300 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.312343 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2c8b\" (UniqueName: \"kubernetes.io/projected/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-kube-api-access-t2c8b\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.312372 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-config\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.312425 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.312486 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.312537 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.312604 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.312880 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.313688 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.314311 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.314760 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-config\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.322020 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.324247 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.329832 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.332946 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t2c8b\" (UniqueName: \"kubernetes.io/projected/82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4-kube-api-access-t2c8b\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.335073 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4\") " pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:43 crc kubenswrapper[4717]: I0217 15:09:43.434330 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.221633 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78855" event={"ID":"cef2f13d-d544-4b06-82df-7c485a318037","Type":"ContainerDied","Data":"6abb5242b39dfa024b98cf56d432e3767b0f84f874651c9739887bf33c2af39c"} Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.221685 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6abb5242b39dfa024b98cf56d432e3767b0f84f874651c9739887bf33c2af39c" Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.255780 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.332634 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-catalog-content\") pod \"cef2f13d-d544-4b06-82df-7c485a318037\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.333229 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzsq4\" (UniqueName: \"kubernetes.io/projected/cef2f13d-d544-4b06-82df-7c485a318037-kube-api-access-nzsq4\") pod \"cef2f13d-d544-4b06-82df-7c485a318037\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.333283 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-utilities\") pod \"cef2f13d-d544-4b06-82df-7c485a318037\" (UID: \"cef2f13d-d544-4b06-82df-7c485a318037\") " Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.334508 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-utilities" (OuterVolumeSpecName: "utilities") pod "cef2f13d-d544-4b06-82df-7c485a318037" (UID: "cef2f13d-d544-4b06-82df-7c485a318037"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.334696 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.340732 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef2f13d-d544-4b06-82df-7c485a318037-kube-api-access-nzsq4" (OuterVolumeSpecName: "kube-api-access-nzsq4") pod "cef2f13d-d544-4b06-82df-7c485a318037" (UID: "cef2f13d-d544-4b06-82df-7c485a318037"). InnerVolumeSpecName "kube-api-access-nzsq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.393331 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cef2f13d-d544-4b06-82df-7c485a318037" (UID: "cef2f13d-d544-4b06-82df-7c485a318037"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.436335 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef2f13d-d544-4b06-82df-7c485a318037-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:09:44 crc kubenswrapper[4717]: I0217 15:09:44.436924 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzsq4\" (UniqueName: \"kubernetes.io/projected/cef2f13d-d544-4b06-82df-7c485a318037-kube-api-access-nzsq4\") on node \"crc\" DevicePath \"\"" Feb 17 15:09:45 crc kubenswrapper[4717]: I0217 15:09:45.230021 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-78855" Feb 17 15:09:45 crc kubenswrapper[4717]: I0217 15:09:45.270662 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-78855"] Feb 17 15:09:45 crc kubenswrapper[4717]: I0217 15:09:45.276318 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-78855"] Feb 17 15:09:45 crc kubenswrapper[4717]: I0217 15:09:45.856312 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef2f13d-d544-4b06-82df-7c485a318037" path="/var/lib/kubelet/pods/cef2f13d-d544-4b06-82df-7c485a318037/volumes" Feb 17 15:09:47 crc kubenswrapper[4717]: E0217 15:09:47.934812 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 17 15:09:47 crc kubenswrapper[4717]: E0217 15:09:47.935946 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mqxbk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(8924cebf-3c79-4978-9564-ec8869b9d79a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:09:47 crc 
kubenswrapper[4717]: E0217 15:09:47.937218 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="8924cebf-3c79-4978-9564-ec8869b9d79a" Feb 17 15:09:47 crc kubenswrapper[4717]: E0217 15:09:47.969751 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 17 15:09:47 crc kubenswrapper[4717]: E0217 15:09:47.969953 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8jr7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(0eb38f44-bed1-4e65-8de2-9624715baee1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:09:47 crc 
kubenswrapper[4717]: E0217 15:09:47.973609 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0eb38f44-bed1-4e65-8de2-9624715baee1" Feb 17 15:09:48 crc kubenswrapper[4717]: E0217 15:09:48.257939 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="8924cebf-3c79-4978-9564-ec8869b9d79a" Feb 17 15:09:48 crc kubenswrapper[4717]: E0217 15:09:48.258917 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0eb38f44-bed1-4e65-8de2-9624715baee1" Feb 17 15:09:50 crc kubenswrapper[4717]: I0217 15:09:50.808279 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:09:50 crc kubenswrapper[4717]: I0217 15:09:50.808639 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:09:50 crc kubenswrapper[4717]: I0217 15:09:50.808707 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:09:50 crc kubenswrapper[4717]: I0217 15:09:50.809577 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"894debb3d49a5afafc8d152c1e296cd8509036d91968011b7ffc16cede4826fe"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:09:50 crc kubenswrapper[4717]: I0217 15:09:50.809680 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://894debb3d49a5afafc8d152c1e296cd8509036d91968011b7ffc16cede4826fe" gracePeriod=600 Feb 17 15:09:51 crc kubenswrapper[4717]: I0217 15:09:51.279052 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="894debb3d49a5afafc8d152c1e296cd8509036d91968011b7ffc16cede4826fe" exitCode=0 Feb 17 15:09:51 crc kubenswrapper[4717]: I0217 15:09:51.279111 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"894debb3d49a5afafc8d152c1e296cd8509036d91968011b7ffc16cede4826fe"} Feb 17 15:09:51 crc kubenswrapper[4717]: I0217 15:09:51.279170 4717 scope.go:117] "RemoveContainer" containerID="a6cb2d452429431b113e7eb8a0c1c4fb59dfbfa50aec6a5702cb9844fb01cb5c" Feb 17 15:09:54 crc kubenswrapper[4717]: I0217 15:09:54.117456 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 15:09:54 crc kubenswrapper[4717]: I0217 15:09:54.127252 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 15:09:54 crc 
kubenswrapper[4717]: I0217 15:09:54.134496 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5r2q4"] Feb 17 15:09:54 crc kubenswrapper[4717]: I0217 15:09:54.182438 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k4zc7"] Feb 17 15:09:54 crc kubenswrapper[4717]: I0217 15:09:54.189231 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 15:09:54 crc kubenswrapper[4717]: I0217 15:09:54.345548 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 15:09:54 crc kubenswrapper[4717]: W0217 15:09:54.552609 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b7d39d_8183_4e96_a163_b72323ccb0b5.slice/crio-589f6732e50f7cf744d4e27db0bd110d1a02fe5fafcc66cf784a2f140b2d2b81 WatchSource:0}: Error finding container 589f6732e50f7cf744d4e27db0bd110d1a02fe5fafcc66cf784a2f140b2d2b81: Status 404 returned error can't find the container with id 589f6732e50f7cf744d4e27db0bd110d1a02fe5fafcc66cf784a2f140b2d2b81 Feb 17 15:09:54 crc kubenswrapper[4717]: W0217 15:09:54.560183 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ae80c8e_65ba_4b6b_a3d6_6dcb215aca1f.slice/crio-27aeac1eb34d35bdacdfc5e08f21b3a0e0c5c2976e69addeaf57f22fd469b21f WatchSource:0}: Error finding container 27aeac1eb34d35bdacdfc5e08f21b3a0e0c5c2976e69addeaf57f22fd469b21f: Status 404 returned error can't find the container with id 27aeac1eb34d35bdacdfc5e08f21b3a0e0c5c2976e69addeaf57f22fd469b21f Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 15:09:54.596372 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 
15:09:54.596570 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pj9rk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPo
licy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-bczjv_openstack(d5dd2394-2d20-49b9-bd73-377d1376aa73): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 15:09:54.598266 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" podUID="d5dd2394-2d20-49b9-bd73-377d1376aa73" Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 15:09:54.608547 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 15:09:54.608742 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f5kth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-s546s_openstack(39743706-11cd-45b4-ac36-2f4db1e61723): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 15:09:54.610608 4717 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" podUID="39743706-11cd-45b4-ac36-2f4db1e61723" Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 15:09:54.666604 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 15:09:54.667113 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2k6nn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-xhwc2_openstack(7d89b253-1e16-4f06-a413-db01a9af4574): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 15:09:54.668319 4717 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" podUID="7d89b253-1e16-4f06-a413-db01a9af4574" Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 15:09:54.678299 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 15:09:54.678488 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btdpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-crgsg_openstack(fb0eb1e2-46e0-4783-bdb4-920c8501a97a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:09:54 crc kubenswrapper[4717]: E0217 15:09:54.679835 4717 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-crgsg" podUID="fb0eb1e2-46e0-4783-bdb4-920c8501a97a" Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.110817 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.315279 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f","Type":"ContainerStarted","Data":"27aeac1eb34d35bdacdfc5e08f21b3a0e0c5c2976e69addeaf57f22fd469b21f"} Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.317884 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8ec72b87-16f5-487e-ae08-a52b5d289bee","Type":"ContainerStarted","Data":"c005f093f6e2950493c0fac199e0bf12fbd7af921dad9b0bb9885d2118f70b1b"} Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.321619 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"f8e20bac773d3781e4315850afd9f8f1df648a8ef53c688d37ae5161d1be4600"} Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.324995 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e905bd78-7554-4636-b508-a2a67078018e","Type":"ContainerStarted","Data":"c192396115878c5165d2a70890fecfdb19fe679f7a6da71a920de31f001223e7"} Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.326912 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k4zc7" event={"ID":"edd6eb53-55b7-4a61-867c-e4bf277af963","Type":"ContainerStarted","Data":"64b2fd6753d966fe27555db28cc9fd1103970311b69678599d1a768058d90e8c"} Feb 17 15:09:55 crc kubenswrapper[4717]: 
I0217 15:09:55.328814 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5r2q4" event={"ID":"1954ed93-1aa6-4c08-8379-01d047f5da20","Type":"ContainerStarted","Data":"7c7a2c55ae78a054fd1efc2794f7d059a2c893df690eb55475fa0a047c74a73d"} Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.330467 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"14b7d39d-8183-4e96-a163-b72323ccb0b5","Type":"ContainerStarted","Data":"e319194386825ba1cb49a9bbce52b9886c83499fc9dd9102c5d01f066e8a8666"} Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.330534 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"14b7d39d-8183-4e96-a163-b72323ccb0b5","Type":"ContainerStarted","Data":"589f6732e50f7cf744d4e27db0bd110d1a02fe5fafcc66cf784a2f140b2d2b81"} Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.336329 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4","Type":"ContainerStarted","Data":"6a5b3a0a4c2da028c6817fb03fbf6b1f7bfb92a8cfde558957425da814fc5ddd"} Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.341156 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7a71caa4-7d53-466e-8d74-98c814d3afda","Type":"ContainerStarted","Data":"e05598c677927af29a1ff52864a907d7121ccb3347175a310a70b4247b6639e0"} Feb 17 15:09:55 crc kubenswrapper[4717]: E0217 15:09:55.355249 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-crgsg" podUID="fb0eb1e2-46e0-4783-bdb4-920c8501a97a" Feb 17 15:09:55 crc kubenswrapper[4717]: E0217 15:09:55.355328 4717 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" podUID="7d89b253-1e16-4f06-a413-db01a9af4574" Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.805621 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.810353 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.963916 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5kth\" (UniqueName: \"kubernetes.io/projected/39743706-11cd-45b4-ac36-2f4db1e61723-kube-api-access-f5kth\") pod \"39743706-11cd-45b4-ac36-2f4db1e61723\" (UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.964068 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dd2394-2d20-49b9-bd73-377d1376aa73-config\") pod \"d5dd2394-2d20-49b9-bd73-377d1376aa73\" (UID: \"d5dd2394-2d20-49b9-bd73-377d1376aa73\") " Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.964154 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-dns-svc\") pod \"39743706-11cd-45b4-ac36-2f4db1e61723\" (UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.964286 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-config\") pod \"39743706-11cd-45b4-ac36-2f4db1e61723\" 
(UID: \"39743706-11cd-45b4-ac36-2f4db1e61723\") " Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.964329 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj9rk\" (UniqueName: \"kubernetes.io/projected/d5dd2394-2d20-49b9-bd73-377d1376aa73-kube-api-access-pj9rk\") pod \"d5dd2394-2d20-49b9-bd73-377d1376aa73\" (UID: \"d5dd2394-2d20-49b9-bd73-377d1376aa73\") " Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.966280 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-config" (OuterVolumeSpecName: "config") pod "39743706-11cd-45b4-ac36-2f4db1e61723" (UID: "39743706-11cd-45b4-ac36-2f4db1e61723"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.966709 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39743706-11cd-45b4-ac36-2f4db1e61723" (UID: "39743706-11cd-45b4-ac36-2f4db1e61723"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.970011 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5dd2394-2d20-49b9-bd73-377d1376aa73-config" (OuterVolumeSpecName: "config") pod "d5dd2394-2d20-49b9-bd73-377d1376aa73" (UID: "d5dd2394-2d20-49b9-bd73-377d1376aa73"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.990781 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5dd2394-2d20-49b9-bd73-377d1376aa73-kube-api-access-pj9rk" (OuterVolumeSpecName: "kube-api-access-pj9rk") pod "d5dd2394-2d20-49b9-bd73-377d1376aa73" (UID: "d5dd2394-2d20-49b9-bd73-377d1376aa73"). InnerVolumeSpecName "kube-api-access-pj9rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:09:55 crc kubenswrapper[4717]: I0217 15:09:55.999316 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39743706-11cd-45b4-ac36-2f4db1e61723-kube-api-access-f5kth" (OuterVolumeSpecName: "kube-api-access-f5kth") pod "39743706-11cd-45b4-ac36-2f4db1e61723" (UID: "39743706-11cd-45b4-ac36-2f4db1e61723"). InnerVolumeSpecName "kube-api-access-f5kth". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.066950 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.066998 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj9rk\" (UniqueName: \"kubernetes.io/projected/d5dd2394-2d20-49b9-bd73-377d1376aa73-kube-api-access-pj9rk\") on node \"crc\" DevicePath \"\"" Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.067019 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5kth\" (UniqueName: \"kubernetes.io/projected/39743706-11cd-45b4-ac36-2f4db1e61723-kube-api-access-f5kth\") on node \"crc\" DevicePath \"\"" Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.067037 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d5dd2394-2d20-49b9-bd73-377d1376aa73-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.067048 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39743706-11cd-45b4-ac36-2f4db1e61723-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.347877 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.347870 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bczjv" event={"ID":"d5dd2394-2d20-49b9-bd73-377d1376aa73","Type":"ContainerDied","Data":"f0cc6f0bd4c07271510a1f84d5fbdae66be18b2171f753e02b82852344daf066"} Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.349680 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" event={"ID":"39743706-11cd-45b4-ac36-2f4db1e61723","Type":"ContainerDied","Data":"0e053d85a29119ebc1ad090a601395a503176f3f9f9261fe6bdfe99a6b40f773"} Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.349713 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s546s" Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.419140 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bczjv"] Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.424180 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bczjv"] Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.460549 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s546s"] Feb 17 15:09:56 crc kubenswrapper[4717]: I0217 15:09:56.466788 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s546s"] Feb 17 15:09:57 crc kubenswrapper[4717]: I0217 15:09:57.856590 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39743706-11cd-45b4-ac36-2f4db1e61723" path="/var/lib/kubelet/pods/39743706-11cd-45b4-ac36-2f4db1e61723/volumes" Feb 17 15:09:57 crc kubenswrapper[4717]: I0217 15:09:57.857461 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5dd2394-2d20-49b9-bd73-377d1376aa73" path="/var/lib/kubelet/pods/d5dd2394-2d20-49b9-bd73-377d1376aa73/volumes" Feb 17 15:10:01 crc kubenswrapper[4717]: I0217 15:10:01.394363 4717 generic.go:334] "Generic (PLEG): container finished" podID="8ec72b87-16f5-487e-ae08-a52b5d289bee" containerID="c005f093f6e2950493c0fac199e0bf12fbd7af921dad9b0bb9885d2118f70b1b" exitCode=0 Feb 17 15:10:01 crc kubenswrapper[4717]: I0217 15:10:01.394451 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8ec72b87-16f5-487e-ae08-a52b5d289bee","Type":"ContainerDied","Data":"c005f093f6e2950493c0fac199e0bf12fbd7af921dad9b0bb9885d2118f70b1b"} Feb 17 15:10:01 crc kubenswrapper[4717]: I0217 15:10:01.398277 4717 generic.go:334] "Generic (PLEG): container finished" podID="14b7d39d-8183-4e96-a163-b72323ccb0b5" 
containerID="e319194386825ba1cb49a9bbce52b9886c83499fc9dd9102c5d01f066e8a8666" exitCode=0 Feb 17 15:10:01 crc kubenswrapper[4717]: I0217 15:10:01.398319 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"14b7d39d-8183-4e96-a163-b72323ccb0b5","Type":"ContainerDied","Data":"e319194386825ba1cb49a9bbce52b9886c83499fc9dd9102c5d01f066e8a8666"} Feb 17 15:10:02 crc kubenswrapper[4717]: I0217 15:10:02.406915 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f","Type":"ContainerStarted","Data":"393c974f05dfb37fa307f1dc9ae8399cb1f8b30ed837e8130a02775f9a7d2c21"} Feb 17 15:10:02 crc kubenswrapper[4717]: I0217 15:10:02.407363 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 17 15:10:02 crc kubenswrapper[4717]: I0217 15:10:02.436703 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=24.382232576 podStartE2EDuration="29.436674997s" podCreationTimestamp="2026-02-17 15:09:33 +0000 UTC" firstStartedPulling="2026-02-17 15:09:54.568380496 +0000 UTC m=+1060.984220972" lastFinishedPulling="2026-02-17 15:09:59.622822917 +0000 UTC m=+1066.038663393" observedRunningTime="2026-02-17 15:10:02.422995957 +0000 UTC m=+1068.838836443" watchObservedRunningTime="2026-02-17 15:10:02.436674997 +0000 UTC m=+1068.852515513" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.098582 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xcmtm"] Feb 17 15:10:03 crc kubenswrapper[4717]: E0217 15:10:03.099425 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef2f13d-d544-4b06-82df-7c485a318037" containerName="extract-utilities" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.099448 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef2f13d-d544-4b06-82df-7c485a318037" 
containerName="extract-utilities" Feb 17 15:10:03 crc kubenswrapper[4717]: E0217 15:10:03.099467 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef2f13d-d544-4b06-82df-7c485a318037" containerName="registry-server" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.099476 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef2f13d-d544-4b06-82df-7c485a318037" containerName="registry-server" Feb 17 15:10:03 crc kubenswrapper[4717]: E0217 15:10:03.099496 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef2f13d-d544-4b06-82df-7c485a318037" containerName="extract-content" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.099504 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef2f13d-d544-4b06-82df-7c485a318037" containerName="extract-content" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.099672 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef2f13d-d544-4b06-82df-7c485a318037" containerName="registry-server" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.100464 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.104417 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.115994 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xcmtm"] Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.198525 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-config\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.198579 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-ovs-rundir\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.198619 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-combined-ca-bundle\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.198646 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-ovn-rundir\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " 
pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.198667 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.198713 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzgd9\" (UniqueName: \"kubernetes.io/projected/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-kube-api-access-rzgd9\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.259371 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xhwc2"] Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.300447 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-config\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.300504 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-ovs-rundir\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.300549 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-combined-ca-bundle\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.300571 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-ovn-rundir\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.300596 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.300652 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzgd9\" (UniqueName: \"kubernetes.io/projected/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-kube-api-access-rzgd9\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.301739 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-ovn-rundir\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.302271 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-ovs-rundir\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.304249 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-config\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.313189 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-combined-ca-bundle\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.331800 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nn955"] Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.334049 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.337969 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.354614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzgd9\" (UniqueName: \"kubernetes.io/projected/e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74-kube-api-access-rzgd9\") pod \"ovn-controller-metrics-xcmtm\" (UID: \"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74\") " pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.364357 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.403428 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nn955"] Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.404655 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.404706 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-567sm\" (UniqueName: \"kubernetes.io/projected/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-kube-api-access-567sm\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " 
pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.404746 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-config\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.404772 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.433604 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xcmtm" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.506277 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.506344 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-567sm\" (UniqueName: \"kubernetes.io/projected/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-kube-api-access-567sm\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.506391 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-config\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.506421 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.507516 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.508752 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.511226 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-config\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.516755 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k4zc7" event={"ID":"edd6eb53-55b7-4a61-867c-e4bf277af963","Type":"ContainerStarted","Data":"5e72cf67ba21cba046eb3d66474911574baed3582ab96a2ca812b43d17cedb7c"} 
Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.517253 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-k4zc7" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.550041 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-567sm\" (UniqueName: \"kubernetes.io/projected/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-kube-api-access-567sm\") pod \"dnsmasq-dns-7fd796d7df-nn955\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.567647 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8924cebf-3c79-4978-9564-ec8869b9d79a","Type":"ContainerStarted","Data":"9d4afd969ca60da733acf3772d99a5fb8c4614efb5bf99795644d5b1b294843f"} Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.587948 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-k4zc7" podStartSLOduration=18.736686299 podStartE2EDuration="24.587920848s" podCreationTimestamp="2026-02-17 15:09:39 +0000 UTC" firstStartedPulling="2026-02-17 15:09:54.558388192 +0000 UTC m=+1060.974228668" lastFinishedPulling="2026-02-17 15:10:00.409622741 +0000 UTC m=+1066.825463217" observedRunningTime="2026-02-17 15:10:03.569151864 +0000 UTC m=+1069.984992340" watchObservedRunningTime="2026-02-17 15:10:03.587920848 +0000 UTC m=+1070.003761324" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.589962 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8ec72b87-16f5-487e-ae08-a52b5d289bee","Type":"ContainerStarted","Data":"12ed66498723bd456522b9069b6977b239603497d9d53d964f6e4be2b2281fb5"} Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.599878 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5r2q4" 
event={"ID":"1954ed93-1aa6-4c08-8379-01d047f5da20","Type":"ContainerStarted","Data":"72cba1daafd275dd52522bc3640b84722529909240077d956da229c64352c683"} Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.610593 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"14b7d39d-8183-4e96-a163-b72323ccb0b5","Type":"ContainerStarted","Data":"64bdf276959dcbd5e7e06507152662f2f2104b72b3331fb7039d3b5d0a160386"} Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.671414 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-crgsg"] Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.672328 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=11.072090171 podStartE2EDuration="32.672317819s" podCreationTimestamp="2026-02-17 15:09:31 +0000 UTC" firstStartedPulling="2026-02-17 15:09:32.980590672 +0000 UTC m=+1039.396431148" lastFinishedPulling="2026-02-17 15:09:54.58081832 +0000 UTC m=+1060.996658796" observedRunningTime="2026-02-17 15:10:03.662387046 +0000 UTC m=+1070.078227542" watchObservedRunningTime="2026-02-17 15:10:03.672317819 +0000 UTC m=+1070.088158295" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.672509 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e905bd78-7554-4636-b508-a2a67078018e","Type":"ContainerStarted","Data":"91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e"} Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.675722 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.704621 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4","Type":"ContainerStarted","Data":"a38cc968dfc58b5d48971d6106cd1c4bbd17e819decddaa84850c7c6207e7f28"} Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.718936 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7a71caa4-7d53-466e-8d74-98c814d3afda","Type":"ContainerStarted","Data":"7a0e1384c754849efca14f8c612aed38e2ee55d7130d5e20bd865e6bd452e8e8"} Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.736479 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0eb38f44-bed1-4e65-8de2-9624715baee1","Type":"ContainerStarted","Data":"9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b"} Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.751138 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.752220 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z55jw"] Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.753875 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.760256 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z55jw"] Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.762795 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.764189 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=31.764168102 podStartE2EDuration="31.764168102s" podCreationTimestamp="2026-02-17 15:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:03.737567075 +0000 UTC m=+1070.153407551" watchObservedRunningTime="2026-02-17 15:10:03.764168102 +0000 UTC m=+1070.180008578" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.788169 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.788863 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.791550 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=20.554863649 podStartE2EDuration="27.79153312s" podCreationTimestamp="2026-02-17 15:09:36 +0000 UTC" firstStartedPulling="2026-02-17 15:09:54.547031969 +0000 UTC m=+1060.962872445" lastFinishedPulling="2026-02-17 15:10:01.78370144 +0000 UTC m=+1068.199541916" observedRunningTime="2026-02-17 15:10:03.783882763 +0000 UTC m=+1070.199723239" watchObservedRunningTime="2026-02-17 15:10:03.79153312 +0000 UTC m=+1070.207373586" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.811933 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z4st\" (UniqueName: \"kubernetes.io/projected/d3f84dac-d50e-4366-921b-e10d201ce421-kube-api-access-6z4st\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.812047 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.812168 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-config\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.812208 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.812239 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" Feb 17 15:10:03 crc 
kubenswrapper[4717]: I0217 15:10:03.913501 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.913985 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.914467 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z4st\" (UniqueName: \"kubernetes.io/projected/d3f84dac-d50e-4366-921b-e10d201ce421-kube-api-access-6z4st\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.915020 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.915908 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.916529 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-config\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.916697 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-config\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.917252 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.917358 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.935318 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z4st\" (UniqueName: \"kubernetes.io/projected/d3f84dac-d50e-4366-921b-e10d201ce421-kube-api-access-6z4st\") pod \"dnsmasq-dns-86db49b7ff-z55jw\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:03 crc kubenswrapper[4717]: I0217 15:10:03.967292 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2"
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.018838 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k6nn\" (UniqueName: \"kubernetes.io/projected/7d89b253-1e16-4f06-a413-db01a9af4574-kube-api-access-2k6nn\") pod \"7d89b253-1e16-4f06-a413-db01a9af4574\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") "
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.018918 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-dns-svc\") pod \"7d89b253-1e16-4f06-a413-db01a9af4574\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") "
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.018987 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-config\") pod \"7d89b253-1e16-4f06-a413-db01a9af4574\" (UID: \"7d89b253-1e16-4f06-a413-db01a9af4574\") "
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.020651 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d89b253-1e16-4f06-a413-db01a9af4574" (UID: "7d89b253-1e16-4f06-a413-db01a9af4574"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.021737 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-config" (OuterVolumeSpecName: "config") pod "7d89b253-1e16-4f06-a413-db01a9af4574" (UID: "7d89b253-1e16-4f06-a413-db01a9af4574"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.039308 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d89b253-1e16-4f06-a413-db01a9af4574-kube-api-access-2k6nn" (OuterVolumeSpecName: "kube-api-access-2k6nn") pod "7d89b253-1e16-4f06-a413-db01a9af4574" (UID: "7d89b253-1e16-4f06-a413-db01a9af4574"). InnerVolumeSpecName "kube-api-access-2k6nn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.088563 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.120730 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k6nn\" (UniqueName: \"kubernetes.io/projected/7d89b253-1e16-4f06-a413-db01a9af4574-kube-api-access-2k6nn\") on node \"crc\" DevicePath \"\""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.121208 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.121218 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d89b253-1e16-4f06-a413-db01a9af4574-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.136726 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-crgsg"
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.222321 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btdpj\" (UniqueName: \"kubernetes.io/projected/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-kube-api-access-btdpj\") pod \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") "
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.222408 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-dns-svc\") pod \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") "
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.222534 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-config\") pod \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\" (UID: \"fb0eb1e2-46e0-4783-bdb4-920c8501a97a\") "
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.222899 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb0eb1e2-46e0-4783-bdb4-920c8501a97a" (UID: "fb0eb1e2-46e0-4783-bdb4-920c8501a97a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.223168 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.223191 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-config" (OuterVolumeSpecName: "config") pod "fb0eb1e2-46e0-4783-bdb4-920c8501a97a" (UID: "fb0eb1e2-46e0-4783-bdb4-920c8501a97a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.231484 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-kube-api-access-btdpj" (OuterVolumeSpecName: "kube-api-access-btdpj") pod "fb0eb1e2-46e0-4783-bdb4-920c8501a97a" (UID: "fb0eb1e2-46e0-4783-bdb4-920c8501a97a"). InnerVolumeSpecName "kube-api-access-btdpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.307167 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xcmtm"]
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.324384 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btdpj\" (UniqueName: \"kubernetes.io/projected/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-kube-api-access-btdpj\") on node \"crc\" DevicePath \"\""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.324419 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb0eb1e2-46e0-4783-bdb4-920c8501a97a-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.398896 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nn955"]
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.570419 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z55jw"]
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.745381 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-crgsg" event={"ID":"fb0eb1e2-46e0-4783-bdb4-920c8501a97a","Type":"ContainerDied","Data":"0519f37e952d1de6fc06c96a0b14adeca5cc30b6994e398245e352c27d687092"}
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.745464 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-crgsg"
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.746978 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2" event={"ID":"7d89b253-1e16-4f06-a413-db01a9af4574","Type":"ContainerDied","Data":"6fa49c0a2216287a85f505754556409787b3ddb7ed345ababd475ddcdd5ec848"}
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.747198 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xhwc2"
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.757792 4717 generic.go:334] "Generic (PLEG): container finished" podID="1954ed93-1aa6-4c08-8379-01d047f5da20" containerID="72cba1daafd275dd52522bc3640b84722529909240077d956da229c64352c683" exitCode=0
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.758341 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5r2q4" event={"ID":"1954ed93-1aa6-4c08-8379-01d047f5da20","Type":"ContainerDied","Data":"72cba1daafd275dd52522bc3640b84722529909240077d956da229c64352c683"}
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.818497 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-crgsg"]
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.826813 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-crgsg"]
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.863787 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xhwc2"]
Feb 17 15:10:04 crc kubenswrapper[4717]: I0217 15:10:04.870613 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xhwc2"]
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.767765 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xcmtm" event={"ID":"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74","Type":"ContainerStarted","Data":"309d4ddc6a347b353167a8eb1a35cb80914c1adb35aa98e261fa1e15c5d5906e"}
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.768406 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xcmtm" event={"ID":"e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74","Type":"ContainerStarted","Data":"3f15ce4b30d6113a892cc63f286e67614d638819f38c468f770a3f03902575f1"}
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.770562 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4","Type":"ContainerStarted","Data":"e8944379f5741b23ea2bd11204aee0d77e281053df377a76497fc3485e02a47f"}
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.771933 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" event={"ID":"d3f84dac-d50e-4366-921b-e10d201ce421","Type":"ContainerStarted","Data":"3903b21e269cd99d14a9e15f3f9fc79b7c9aef40152a2d39ca1aff3171d678dd"}
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.771983 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" event={"ID":"d3f84dac-d50e-4366-921b-e10d201ce421","Type":"ContainerStarted","Data":"8ddec66381fc42ae1a94fea56dbd45835719f745c7f97d166406ab6c470fed70"}
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.773286 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7a71caa4-7d53-466e-8d74-98c814d3afda","Type":"ContainerStarted","Data":"b124772e5e21b7411acc3bc4bc0acdcf60614c02eb9646806c046202e2f22b1e"}
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.775359 4717 generic.go:334] "Generic (PLEG): container finished" podID="b1d6082a-9407-4c1d-b36c-eb78f2879a5a" containerID="da44881ebd449e6a3c9e7823eaff4affd52550b0a0597a53f22e0190010252a3" exitCode=0
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.775418 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nn955" event={"ID":"b1d6082a-9407-4c1d-b36c-eb78f2879a5a","Type":"ContainerDied","Data":"da44881ebd449e6a3c9e7823eaff4affd52550b0a0597a53f22e0190010252a3"}
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.775442 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nn955" event={"ID":"b1d6082a-9407-4c1d-b36c-eb78f2879a5a","Type":"ContainerStarted","Data":"aa3a2c24b8fbba94ad74dc89fbea8ae34729e8100a083232057ee48550ba5c28"}
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.778279 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5r2q4" event={"ID":"1954ed93-1aa6-4c08-8379-01d047f5da20","Type":"ContainerStarted","Data":"b928a83655399be9562dbce83010cd2372fa20defe2b0f248c95337ecc72bad3"}
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.778327 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5r2q4"
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.778341 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5r2q4" event={"ID":"1954ed93-1aa6-4c08-8379-01d047f5da20","Type":"ContainerStarted","Data":"2eb492fe7ec45cbcd72b39e462886c06c62e580bab72fb392a330cfbe5f76aa3"}
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.778817 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5r2q4"
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.792647 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xcmtm" podStartSLOduration=2.3961522889999998 podStartE2EDuration="2.792627678s" podCreationTimestamp="2026-02-17 15:10:03 +0000 UTC" firstStartedPulling="2026-02-17 15:10:04.918324316 +0000 UTC m=+1071.334164792" lastFinishedPulling="2026-02-17 15:10:05.314799715 +0000 UTC m=+1071.730640181" observedRunningTime="2026-02-17 15:10:05.785567457 +0000 UTC m=+1072.201407943" watchObservedRunningTime="2026-02-17 15:10:05.792627678 +0000 UTC m=+1072.208468154"
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.851208 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=13.993956971 podStartE2EDuration="23.851192654s" podCreationTimestamp="2026-02-17 15:09:42 +0000 UTC" firstStartedPulling="2026-02-17 15:09:55.109969604 +0000 UTC m=+1061.525810080" lastFinishedPulling="2026-02-17 15:10:04.967205287 +0000 UTC m=+1071.383045763" observedRunningTime="2026-02-17 15:10:05.849359862 +0000 UTC m=+1072.265200348" watchObservedRunningTime="2026-02-17 15:10:05.851192654 +0000 UTC m=+1072.267033140"
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.866158 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d89b253-1e16-4f06-a413-db01a9af4574" path="/var/lib/kubelet/pods/7d89b253-1e16-4f06-a413-db01a9af4574/volumes"
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.874654 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb0eb1e2-46e0-4783-bdb4-920c8501a97a" path="/var/lib/kubelet/pods/fb0eb1e2-46e0-4783-bdb4-920c8501a97a/volumes"
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.913287 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.479055212 podStartE2EDuration="27.91326663s" podCreationTimestamp="2026-02-17 15:09:38 +0000 UTC" firstStartedPulling="2026-02-17 15:09:54.548073008 +0000 UTC m=+1060.963913484" lastFinishedPulling="2026-02-17 15:10:04.982284426 +0000 UTC m=+1071.398124902" observedRunningTime="2026-02-17 15:10:05.911135739 +0000 UTC m=+1072.326976235" watchObservedRunningTime="2026-02-17 15:10:05.91326663 +0000 UTC m=+1072.329107096"
Feb 17 15:10:05 crc kubenswrapper[4717]: I0217 15:10:05.944220 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5r2q4" podStartSLOduration=20.637459963 podStartE2EDuration="26.94419867s" podCreationTimestamp="2026-02-17 15:09:39 +0000 UTC" firstStartedPulling="2026-02-17 15:09:54.550980371 +0000 UTC m=+1060.966820847" lastFinishedPulling="2026-02-17 15:10:00.857719058 +0000 UTC m=+1067.273559554" observedRunningTime="2026-02-17 15:10:05.938537669 +0000 UTC m=+1072.354378155" watchObservedRunningTime="2026-02-17 15:10:05.94419867 +0000 UTC m=+1072.360039146"
Feb 17 15:10:06 crc kubenswrapper[4717]: I0217 15:10:06.782462 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 17 15:10:06 crc kubenswrapper[4717]: I0217 15:10:06.789560 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nn955" event={"ID":"b1d6082a-9407-4c1d-b36c-eb78f2879a5a","Type":"ContainerStarted","Data":"aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6"}
Feb 17 15:10:06 crc kubenswrapper[4717]: I0217 15:10:06.789641 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-nn955"
Feb 17 15:10:06 crc kubenswrapper[4717]: I0217 15:10:06.791196 4717 generic.go:334] "Generic (PLEG): container finished" podID="d3f84dac-d50e-4366-921b-e10d201ce421" containerID="3903b21e269cd99d14a9e15f3f9fc79b7c9aef40152a2d39ca1aff3171d678dd" exitCode=0
Feb 17 15:10:06 crc kubenswrapper[4717]: I0217 15:10:06.791914 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" event={"ID":"d3f84dac-d50e-4366-921b-e10d201ce421","Type":"ContainerDied","Data":"3903b21e269cd99d14a9e15f3f9fc79b7c9aef40152a2d39ca1aff3171d678dd"}
Feb 17 15:10:06 crc kubenswrapper[4717]: I0217 15:10:06.811982 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-nn955" podStartSLOduration=3.367023351 podStartE2EDuration="3.811960377s" podCreationTimestamp="2026-02-17 15:10:03 +0000 UTC" firstStartedPulling="2026-02-17 15:10:04.92546971 +0000 UTC m=+1071.341310186" lastFinishedPulling="2026-02-17 15:10:05.370406736 +0000 UTC m=+1071.786247212" observedRunningTime="2026-02-17 15:10:06.808989972 +0000 UTC m=+1073.224830468" watchObservedRunningTime="2026-02-17 15:10:06.811960377 +0000 UTC m=+1073.227800863"
Feb 17 15:10:06 crc kubenswrapper[4717]: I0217 15:10:06.846452 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 17 15:10:07 crc kubenswrapper[4717]: I0217 15:10:07.434784 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 17 15:10:07 crc kubenswrapper[4717]: I0217 15:10:07.479669 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 17 15:10:07 crc kubenswrapper[4717]: I0217 15:10:07.802758 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" event={"ID":"d3f84dac-d50e-4366-921b-e10d201ce421","Type":"ContainerStarted","Data":"85eacc059669c5b39c9c461429b4070c4c5bafba58fbb5dc845db03726385662"}
Feb 17 15:10:07 crc kubenswrapper[4717]: I0217 15:10:07.803726 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw"
Feb 17 15:10:07 crc kubenswrapper[4717]: I0217 15:10:07.803979 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 17 15:10:07 crc kubenswrapper[4717]: I0217 15:10:07.804134 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 17 15:10:07 crc kubenswrapper[4717]: I0217 15:10:07.823780 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" podStartSLOduration=4.301039871 podStartE2EDuration="4.823760491s" podCreationTimestamp="2026-02-17 15:10:03 +0000 UTC" firstStartedPulling="2026-02-17 15:10:04.925213112 +0000 UTC m=+1071.341053588" lastFinishedPulling="2026-02-17 15:10:05.447933732 +0000 UTC m=+1071.863774208" observedRunningTime="2026-02-17 15:10:07.823205795 +0000 UTC m=+1074.239046271" watchObservedRunningTime="2026-02-17 15:10:07.823760491 +0000 UTC m=+1074.239600967"
Feb 17 15:10:07 crc kubenswrapper[4717]: I0217 15:10:07.857478 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 17 15:10:07 crc kubenswrapper[4717]: I0217 15:10:07.857534 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.159613 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.160956 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.163324 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.163663 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.165082 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.165409 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-x92vf"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.174797 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.304433 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl99c\" (UniqueName: \"kubernetes.io/projected/a56b9da3-623b-44df-861c-62c9b45566db-kube-api-access-zl99c\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.304764 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a56b9da3-623b-44df-861c-62c9b45566db-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.304919 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a56b9da3-623b-44df-861c-62c9b45566db-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.304991 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a56b9da3-623b-44df-861c-62c9b45566db-scripts\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.305040 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a56b9da3-623b-44df-861c-62c9b45566db-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.305123 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56b9da3-623b-44df-861c-62c9b45566db-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.305234 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56b9da3-623b-44df-861c-62c9b45566db-config\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.407050 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a56b9da3-623b-44df-861c-62c9b45566db-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.407124 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a56b9da3-623b-44df-861c-62c9b45566db-scripts\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.407149 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a56b9da3-623b-44df-861c-62c9b45566db-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.407172 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56b9da3-623b-44df-861c-62c9b45566db-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.407215 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56b9da3-623b-44df-861c-62c9b45566db-config\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.407237 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl99c\" (UniqueName: \"kubernetes.io/projected/a56b9da3-623b-44df-861c-62c9b45566db-kube-api-access-zl99c\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.407296 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a56b9da3-623b-44df-861c-62c9b45566db-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.407772 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a56b9da3-623b-44df-861c-62c9b45566db-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.408721 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56b9da3-623b-44df-861c-62c9b45566db-config\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.408884 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a56b9da3-623b-44df-861c-62c9b45566db-scripts\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.411469 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a56b9da3-623b-44df-861c-62c9b45566db-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.411673 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56b9da3-623b-44df-861c-62c9b45566db-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.413573 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a56b9da3-623b-44df-861c-62c9b45566db-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.435669 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl99c\" (UniqueName: \"kubernetes.io/projected/a56b9da3-623b-44df-861c-62c9b45566db-kube-api-access-zl99c\") pod \"ovn-northd-0\" (UID: \"a56b9da3-623b-44df-861c-62c9b45566db\") " pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.452344 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.483160 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.528453 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.844381 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Feb 17 15:10:08 crc kubenswrapper[4717]: I0217 15:10:08.969515 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 15:10:09 crc kubenswrapper[4717]: I0217 15:10:09.826491 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a56b9da3-623b-44df-861c-62c9b45566db","Type":"ContainerStarted","Data":"2fc9b2c583294fde6a6a26c713c5a4d90f3d781384c8189e8d43c4ce563e5c47"}
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.452308 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.453171 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.509401 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6jvhf"]
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.510415 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6jvhf"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.518218 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.548499 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6jvhf"]
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.603549 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad94a78f-0c92-4603-863d-f4c50c946028-operator-scripts\") pod \"root-account-create-update-6jvhf\" (UID: \"ad94a78f-0c92-4603-863d-f4c50c946028\") " pod="openstack/root-account-create-update-6jvhf"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.603644 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm6sw\" (UniqueName: \"kubernetes.io/projected/ad94a78f-0c92-4603-863d-f4c50c946028-kube-api-access-rm6sw\") pod \"root-account-create-update-6jvhf\" (UID: \"ad94a78f-0c92-4603-863d-f4c50c946028\") " pod="openstack/root-account-create-update-6jvhf"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.608553 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.705182 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad94a78f-0c92-4603-863d-f4c50c946028-operator-scripts\") pod \"root-account-create-update-6jvhf\" (UID: \"ad94a78f-0c92-4603-863d-f4c50c946028\") " pod="openstack/root-account-create-update-6jvhf"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.705259 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm6sw\" (UniqueName: \"kubernetes.io/projected/ad94a78f-0c92-4603-863d-f4c50c946028-kube-api-access-rm6sw\") pod \"root-account-create-update-6jvhf\" (UID: \"ad94a78f-0c92-4603-863d-f4c50c946028\") " pod="openstack/root-account-create-update-6jvhf"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.706049 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad94a78f-0c92-4603-863d-f4c50c946028-operator-scripts\") pod \"root-account-create-update-6jvhf\" (UID: \"ad94a78f-0c92-4603-863d-f4c50c946028\") " pod="openstack/root-account-create-update-6jvhf"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.725643 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm6sw\" (UniqueName: \"kubernetes.io/projected/ad94a78f-0c92-4603-863d-f4c50c946028-kube-api-access-rm6sw\") pod \"root-account-create-update-6jvhf\" (UID: \"ad94a78f-0c92-4603-863d-f4c50c946028\") " pod="openstack/root-account-create-update-6jvhf"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.850939 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a56b9da3-623b-44df-861c-62c9b45566db","Type":"ContainerStarted","Data":"f9e9acabaac9a3cff833c316770cd17d19a29656b178d1448620e91b4510effd"}
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.851339 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a56b9da3-623b-44df-861c-62c9b45566db","Type":"ContainerStarted","Data":"1dcd5207bb0d3c7190ae5c15cda8b311c81b5dab869da76e939e0f262c403f61"}
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.861415 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6jvhf"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.925101 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 17 15:10:12 crc kubenswrapper[4717]: I0217 15:10:12.955633 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.936137883 podStartE2EDuration="4.955612073s" podCreationTimestamp="2026-02-17 15:10:08 +0000 UTC" firstStartedPulling="2026-02-17 15:10:08.984047119 +0000 UTC m=+1075.399887595" lastFinishedPulling="2026-02-17 15:10:12.003521319 +0000 UTC m=+1078.419361785" observedRunningTime="2026-02-17 15:10:12.874363011 +0000 UTC m=+1079.290203497" watchObservedRunningTime="2026-02-17 15:10:12.955612073 +0000 UTC m=+1079.371452569"
Feb 17 15:10:13 crc kubenswrapper[4717]: I0217 15:10:13.340956 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6jvhf"]
Feb 17 15:10:13 crc kubenswrapper[4717]: W0217 15:10:13.345813 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad94a78f_0c92_4603_863d_f4c50c946028.slice/crio-5f86d4913958c53b3b3d7a9fe645c3f066f1c9755ad79dc62b9616ff02fbc748 WatchSource:0}: Error finding container 5f86d4913958c53b3b3d7a9fe645c3f066f1c9755ad79dc62b9616ff02fbc748: Status 404 returned error can't find the container with id 5f86d4913958c53b3b3d7a9fe645c3f066f1c9755ad79dc62b9616ff02fbc748
Feb 17 15:10:13 crc kubenswrapper[4717]: I0217 15:10:13.754662 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-nn955"
Feb 17 15:10:13 crc kubenswrapper[4717]: I0217 15:10:13.862140 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6jvhf"
event={"ID":"ad94a78f-0c92-4603-863d-f4c50c946028","Type":"ContainerStarted","Data":"84ed6405ea6951a592a433821a319d57e411cedb11f5a4732e6cd0b1381802b8"} Feb 17 15:10:13 crc kubenswrapper[4717]: I0217 15:10:13.862190 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6jvhf" event={"ID":"ad94a78f-0c92-4603-863d-f4c50c946028","Type":"ContainerStarted","Data":"5f86d4913958c53b3b3d7a9fe645c3f066f1c9755ad79dc62b9616ff02fbc748"} Feb 17 15:10:13 crc kubenswrapper[4717]: I0217 15:10:13.862377 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 17 15:10:13 crc kubenswrapper[4717]: I0217 15:10:13.884300 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-6jvhf" podStartSLOduration=1.884284592 podStartE2EDuration="1.884284592s" podCreationTimestamp="2026-02-17 15:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:13.88034514 +0000 UTC m=+1080.296185636" watchObservedRunningTime="2026-02-17 15:10:13.884284592 +0000 UTC m=+1080.300125068" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.091417 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.147260 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nn955"] Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.147562 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-nn955" podUID="b1d6082a-9407-4c1d-b36c-eb78f2879a5a" containerName="dnsmasq-dns" containerID="cri-o://aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6" gracePeriod=10 Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.586177 4717 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.646846 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-ovsdbserver-nb\") pod \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.646950 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-dns-svc\") pod \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.647058 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-567sm\" (UniqueName: \"kubernetes.io/projected/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-kube-api-access-567sm\") pod \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.647189 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-config\") pod \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\" (UID: \"b1d6082a-9407-4c1d-b36c-eb78f2879a5a\") " Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.652841 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-kube-api-access-567sm" (OuterVolumeSpecName: "kube-api-access-567sm") pod "b1d6082a-9407-4c1d-b36c-eb78f2879a5a" (UID: "b1d6082a-9407-4c1d-b36c-eb78f2879a5a"). InnerVolumeSpecName "kube-api-access-567sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.687828 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1d6082a-9407-4c1d-b36c-eb78f2879a5a" (UID: "b1d6082a-9407-4c1d-b36c-eb78f2879a5a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.696994 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-config" (OuterVolumeSpecName: "config") pod "b1d6082a-9407-4c1d-b36c-eb78f2879a5a" (UID: "b1d6082a-9407-4c1d-b36c-eb78f2879a5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.698519 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b1d6082a-9407-4c1d-b36c-eb78f2879a5a" (UID: "b1d6082a-9407-4c1d-b36c-eb78f2879a5a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.749050 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.749100 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-567sm\" (UniqueName: \"kubernetes.io/projected/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-kube-api-access-567sm\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.749116 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.749124 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1d6082a-9407-4c1d-b36c-eb78f2879a5a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.869862 4717 generic.go:334] "Generic (PLEG): container finished" podID="ad94a78f-0c92-4603-863d-f4c50c946028" containerID="84ed6405ea6951a592a433821a319d57e411cedb11f5a4732e6cd0b1381802b8" exitCode=0 Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.869952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6jvhf" event={"ID":"ad94a78f-0c92-4603-863d-f4c50c946028","Type":"ContainerDied","Data":"84ed6405ea6951a592a433821a319d57e411cedb11f5a4732e6cd0b1381802b8"} Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.871920 4717 generic.go:334] "Generic (PLEG): container finished" podID="b1d6082a-9407-4c1d-b36c-eb78f2879a5a" containerID="aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6" exitCode=0 Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 
15:10:14.871988 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nn955" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.872049 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nn955" event={"ID":"b1d6082a-9407-4c1d-b36c-eb78f2879a5a","Type":"ContainerDied","Data":"aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6"} Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.872150 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nn955" event={"ID":"b1d6082a-9407-4c1d-b36c-eb78f2879a5a","Type":"ContainerDied","Data":"aa3a2c24b8fbba94ad74dc89fbea8ae34729e8100a083232057ee48550ba5c28"} Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.872274 4717 scope.go:117] "RemoveContainer" containerID="aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.903957 4717 scope.go:117] "RemoveContainer" containerID="da44881ebd449e6a3c9e7823eaff4affd52550b0a0597a53f22e0190010252a3" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.919472 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-fc2mw"] Feb 17 15:10:14 crc kubenswrapper[4717]: E0217 15:10:14.923186 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d6082a-9407-4c1d-b36c-eb78f2879a5a" containerName="dnsmasq-dns" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.923221 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d6082a-9407-4c1d-b36c-eb78f2879a5a" containerName="dnsmasq-dns" Feb 17 15:10:14 crc kubenswrapper[4717]: E0217 15:10:14.923273 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d6082a-9407-4c1d-b36c-eb78f2879a5a" containerName="init" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.923281 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b1d6082a-9407-4c1d-b36c-eb78f2879a5a" containerName="init" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.924141 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d6082a-9407-4c1d-b36c-eb78f2879a5a" containerName="dnsmasq-dns" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.949900 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fc2mw" Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.959956 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nn955"] Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.978974 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nn955"] Feb 17 15:10:14 crc kubenswrapper[4717]: I0217 15:10:14.994552 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fc2mw"] Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.007400 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-dfdd-account-create-update-gm7bt"] Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.010039 4717 scope.go:117] "RemoveContainer" containerID="aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6" Feb 17 15:10:15 crc kubenswrapper[4717]: E0217 15:10:15.010720 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6\": container with ID starting with aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6 not found: ID does not exist" containerID="aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.010755 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6"} err="failed to 
get container status \"aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6\": rpc error: code = NotFound desc = could not find container \"aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6\": container with ID starting with aa3d06c9efc9a641cb9bd6faa84de9dab4464986ba71581c8f53191a304b09e6 not found: ID does not exist" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.010784 4717 scope.go:117] "RemoveContainer" containerID="da44881ebd449e6a3c9e7823eaff4affd52550b0a0597a53f22e0190010252a3" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.010798 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dfdd-account-create-update-gm7bt" Feb 17 15:10:15 crc kubenswrapper[4717]: E0217 15:10:15.011284 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da44881ebd449e6a3c9e7823eaff4affd52550b0a0597a53f22e0190010252a3\": container with ID starting with da44881ebd449e6a3c9e7823eaff4affd52550b0a0597a53f22e0190010252a3 not found: ID does not exist" containerID="da44881ebd449e6a3c9e7823eaff4affd52550b0a0597a53f22e0190010252a3" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.011313 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da44881ebd449e6a3c9e7823eaff4affd52550b0a0597a53f22e0190010252a3"} err="failed to get container status \"da44881ebd449e6a3c9e7823eaff4affd52550b0a0597a53f22e0190010252a3\": rpc error: code = NotFound desc = could not find container \"da44881ebd449e6a3c9e7823eaff4affd52550b0a0597a53f22e0190010252a3\": container with ID starting with da44881ebd449e6a3c9e7823eaff4affd52550b0a0597a53f22e0190010252a3 not found: ID does not exist" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.013750 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 
15:10:15.016122 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dfdd-account-create-update-gm7bt"] Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.054454 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92c72e4-581d-4c2d-939b-7a7403142041-operator-scripts\") pod \"keystone-db-create-fc2mw\" (UID: \"c92c72e4-581d-4c2d-939b-7a7403142041\") " pod="openstack/keystone-db-create-fc2mw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.054511 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p99qp\" (UniqueName: \"kubernetes.io/projected/c92c72e4-581d-4c2d-939b-7a7403142041-kube-api-access-p99qp\") pod \"keystone-db-create-fc2mw\" (UID: \"c92c72e4-581d-4c2d-939b-7a7403142041\") " pod="openstack/keystone-db-create-fc2mw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.054556 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8lcv\" (UniqueName: \"kubernetes.io/projected/4fb161a2-7b80-400b-a4eb-ca14e4386721-kube-api-access-k8lcv\") pod \"keystone-dfdd-account-create-update-gm7bt\" (UID: \"4fb161a2-7b80-400b-a4eb-ca14e4386721\") " pod="openstack/keystone-dfdd-account-create-update-gm7bt" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.054648 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb161a2-7b80-400b-a4eb-ca14e4386721-operator-scripts\") pod \"keystone-dfdd-account-create-update-gm7bt\" (UID: \"4fb161a2-7b80-400b-a4eb-ca14e4386721\") " pod="openstack/keystone-dfdd-account-create-update-gm7bt" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.147568 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-fxcxg"] Feb 17 
15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.148611 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-fxcxg" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.157441 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsm8v\" (UniqueName: \"kubernetes.io/projected/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-kube-api-access-bsm8v\") pod \"placement-db-create-fxcxg\" (UID: \"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea\") " pod="openstack/placement-db-create-fxcxg" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.157489 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92c72e4-581d-4c2d-939b-7a7403142041-operator-scripts\") pod \"keystone-db-create-fc2mw\" (UID: \"c92c72e4-581d-4c2d-939b-7a7403142041\") " pod="openstack/keystone-db-create-fc2mw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.157526 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p99qp\" (UniqueName: \"kubernetes.io/projected/c92c72e4-581d-4c2d-939b-7a7403142041-kube-api-access-p99qp\") pod \"keystone-db-create-fc2mw\" (UID: \"c92c72e4-581d-4c2d-939b-7a7403142041\") " pod="openstack/keystone-db-create-fc2mw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.157567 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8lcv\" (UniqueName: \"kubernetes.io/projected/4fb161a2-7b80-400b-a4eb-ca14e4386721-kube-api-access-k8lcv\") pod \"keystone-dfdd-account-create-update-gm7bt\" (UID: \"4fb161a2-7b80-400b-a4eb-ca14e4386721\") " pod="openstack/keystone-dfdd-account-create-update-gm7bt" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.157601 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-operator-scripts\") pod \"placement-db-create-fxcxg\" (UID: \"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea\") " pod="openstack/placement-db-create-fxcxg" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.157641 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb161a2-7b80-400b-a4eb-ca14e4386721-operator-scripts\") pod \"keystone-dfdd-account-create-update-gm7bt\" (UID: \"4fb161a2-7b80-400b-a4eb-ca14e4386721\") " pod="openstack/keystone-dfdd-account-create-update-gm7bt" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.158324 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92c72e4-581d-4c2d-939b-7a7403142041-operator-scripts\") pod \"keystone-db-create-fc2mw\" (UID: \"c92c72e4-581d-4c2d-939b-7a7403142041\") " pod="openstack/keystone-db-create-fc2mw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.158402 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb161a2-7b80-400b-a4eb-ca14e4386721-operator-scripts\") pod \"keystone-dfdd-account-create-update-gm7bt\" (UID: \"4fb161a2-7b80-400b-a4eb-ca14e4386721\") " pod="openstack/keystone-dfdd-account-create-update-gm7bt" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.158613 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-fxcxg"] Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.176857 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8lcv\" (UniqueName: \"kubernetes.io/projected/4fb161a2-7b80-400b-a4eb-ca14e4386721-kube-api-access-k8lcv\") pod \"keystone-dfdd-account-create-update-gm7bt\" (UID: \"4fb161a2-7b80-400b-a4eb-ca14e4386721\") " pod="openstack/keystone-dfdd-account-create-update-gm7bt" 
Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.181813 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p99qp\" (UniqueName: \"kubernetes.io/projected/c92c72e4-581d-4c2d-939b-7a7403142041-kube-api-access-p99qp\") pod \"keystone-db-create-fc2mw\" (UID: \"c92c72e4-581d-4c2d-939b-7a7403142041\") " pod="openstack/keystone-db-create-fc2mw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.254836 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-468e-account-create-update-pf2dw"] Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.256065 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-468e-account-create-update-pf2dw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.258048 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.259372 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsm8v\" (UniqueName: \"kubernetes.io/projected/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-kube-api-access-bsm8v\") pod \"placement-db-create-fxcxg\" (UID: \"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea\") " pod="openstack/placement-db-create-fxcxg" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.259561 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-operator-scripts\") pod \"placement-db-create-fxcxg\" (UID: \"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea\") " pod="openstack/placement-db-create-fxcxg" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.260422 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-operator-scripts\") pod \"placement-db-create-fxcxg\" 
(UID: \"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea\") " pod="openstack/placement-db-create-fxcxg" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.270495 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-468e-account-create-update-pf2dw"] Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.276592 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsm8v\" (UniqueName: \"kubernetes.io/projected/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-kube-api-access-bsm8v\") pod \"placement-db-create-fxcxg\" (UID: \"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea\") " pod="openstack/placement-db-create-fxcxg" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.310008 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fc2mw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.338444 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dfdd-account-create-update-gm7bt" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.360914 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0e668f2-e8a2-403b-af75-9ea042dad451-operator-scripts\") pod \"placement-468e-account-create-update-pf2dw\" (UID: \"c0e668f2-e8a2-403b-af75-9ea042dad451\") " pod="openstack/placement-468e-account-create-update-pf2dw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.360987 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjf2h\" (UniqueName: \"kubernetes.io/projected/c0e668f2-e8a2-403b-af75-9ea042dad451-kube-api-access-hjf2h\") pod \"placement-468e-account-create-update-pf2dw\" (UID: \"c0e668f2-e8a2-403b-af75-9ea042dad451\") " pod="openstack/placement-468e-account-create-update-pf2dw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.462431 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0e668f2-e8a2-403b-af75-9ea042dad451-operator-scripts\") pod \"placement-468e-account-create-update-pf2dw\" (UID: \"c0e668f2-e8a2-403b-af75-9ea042dad451\") " pod="openstack/placement-468e-account-create-update-pf2dw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.462776 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjf2h\" (UniqueName: \"kubernetes.io/projected/c0e668f2-e8a2-403b-af75-9ea042dad451-kube-api-access-hjf2h\") pod \"placement-468e-account-create-update-pf2dw\" (UID: \"c0e668f2-e8a2-403b-af75-9ea042dad451\") " pod="openstack/placement-468e-account-create-update-pf2dw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.463473 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0e668f2-e8a2-403b-af75-9ea042dad451-operator-scripts\") pod \"placement-468e-account-create-update-pf2dw\" (UID: \"c0e668f2-e8a2-403b-af75-9ea042dad451\") " pod="openstack/placement-468e-account-create-update-pf2dw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.463830 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-fxcxg" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.483657 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjf2h\" (UniqueName: \"kubernetes.io/projected/c0e668f2-e8a2-403b-af75-9ea042dad451-kube-api-access-hjf2h\") pod \"placement-468e-account-create-update-pf2dw\" (UID: \"c0e668f2-e8a2-403b-af75-9ea042dad451\") " pod="openstack/placement-468e-account-create-update-pf2dw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.723003 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-468e-account-create-update-pf2dw" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.751959 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fc2mw"] Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.873700 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.882143 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d6082a-9407-4c1d-b36c-eb78f2879a5a" path="/var/lib/kubelet/pods/b1d6082a-9407-4c1d-b36c-eb78f2879a5a/volumes" Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.882732 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dfdd-account-create-update-gm7bt"] Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.889404 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dfdd-account-create-update-gm7bt" event={"ID":"4fb161a2-7b80-400b-a4eb-ca14e4386721","Type":"ContainerStarted","Data":"d5eeed32fe017f446e2cf2fcb26ad6dc6007749f9db92d73dcff309130541f8b"} Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.891412 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fc2mw" event={"ID":"c92c72e4-581d-4c2d-939b-7a7403142041","Type":"ContainerStarted","Data":"74ca1cb943e120db201565e99819b51637c4835605225a23a50475f549ce79e8"} Feb 17 15:10:15 crc kubenswrapper[4717]: I0217 15:10:15.955173 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-fxcxg"] Feb 17 15:10:15 crc kubenswrapper[4717]: W0217 15:10:15.955261 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76fdb52f_cf38_4b2d_8997_ebcb0395e6ea.slice/crio-8698e5a00919ce102aaa6ccbffe87cd39b0f1397878abdc02a13fd8be3c2a54c WatchSource:0}: Error finding container 
8698e5a00919ce102aaa6ccbffe87cd39b0f1397878abdc02a13fd8be3c2a54c: Status 404 returned error can't find the container with id 8698e5a00919ce102aaa6ccbffe87cd39b0f1397878abdc02a13fd8be3c2a54c Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.183862 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6jvhf" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.218546 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-468e-account-create-update-pf2dw"] Feb 17 15:10:16 crc kubenswrapper[4717]: W0217 15:10:16.218629 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0e668f2_e8a2_403b_af75_9ea042dad451.slice/crio-ae4d3cc997cd2a2fb40c7829d926da0f0a28c5e6aec8919e7ab1937a949d636b WatchSource:0}: Error finding container ae4d3cc997cd2a2fb40c7829d926da0f0a28c5e6aec8919e7ab1937a949d636b: Status 404 returned error can't find the container with id ae4d3cc997cd2a2fb40c7829d926da0f0a28c5e6aec8919e7ab1937a949d636b Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.223437 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.275629 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm6sw\" (UniqueName: \"kubernetes.io/projected/ad94a78f-0c92-4603-863d-f4c50c946028-kube-api-access-rm6sw\") pod \"ad94a78f-0c92-4603-863d-f4c50c946028\" (UID: \"ad94a78f-0c92-4603-863d-f4c50c946028\") " Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.275687 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad94a78f-0c92-4603-863d-f4c50c946028-operator-scripts\") pod \"ad94a78f-0c92-4603-863d-f4c50c946028\" (UID: \"ad94a78f-0c92-4603-863d-f4c50c946028\") " Feb 17 
15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.276796 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad94a78f-0c92-4603-863d-f4c50c946028-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad94a78f-0c92-4603-863d-f4c50c946028" (UID: "ad94a78f-0c92-4603-863d-f4c50c946028"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.282596 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad94a78f-0c92-4603-863d-f4c50c946028-kube-api-access-rm6sw" (OuterVolumeSpecName: "kube-api-access-rm6sw") pod "ad94a78f-0c92-4603-863d-f4c50c946028" (UID: "ad94a78f-0c92-4603-863d-f4c50c946028"). InnerVolumeSpecName "kube-api-access-rm6sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.377612 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm6sw\" (UniqueName: \"kubernetes.io/projected/ad94a78f-0c92-4603-863d-f4c50c946028-kube-api-access-rm6sw\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.377666 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad94a78f-0c92-4603-863d-f4c50c946028-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.435558 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-g7mg8"] Feb 17 15:10:16 crc kubenswrapper[4717]: E0217 15:10:16.436373 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad94a78f-0c92-4603-863d-f4c50c946028" containerName="mariadb-account-create-update" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.436399 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad94a78f-0c92-4603-863d-f4c50c946028" 
containerName="mariadb-account-create-update" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.445547 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad94a78f-0c92-4603-863d-f4c50c946028" containerName="mariadb-account-create-update" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.446802 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.449429 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.487404 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-g7mg8"] Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.587434 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vch7q\" (UniqueName: \"kubernetes.io/projected/29c6b382-3315-4904-8fd9-dc7f1c993c2b-kube-api-access-vch7q\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.587694 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.587720 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " 
pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.587766 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-dns-svc\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.587792 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-config\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.690152 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.690303 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-dns-svc\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.690359 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-config\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc 
kubenswrapper[4717]: I0217 15:10:16.690648 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vch7q\" (UniqueName: \"kubernetes.io/projected/29c6b382-3315-4904-8fd9-dc7f1c993c2b-kube-api-access-vch7q\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.690704 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.691635 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.691935 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.691948 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-dns-svc\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.692346 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-config\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.709450 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vch7q\" (UniqueName: \"kubernetes.io/projected/29c6b382-3315-4904-8fd9-dc7f1c993c2b-kube-api-access-vch7q\") pod \"dnsmasq-dns-698758b865-g7mg8\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.811775 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.913500 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6jvhf" event={"ID":"ad94a78f-0c92-4603-863d-f4c50c946028","Type":"ContainerDied","Data":"5f86d4913958c53b3b3d7a9fe645c3f066f1c9755ad79dc62b9616ff02fbc748"} Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.913531 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6jvhf" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.913544 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f86d4913958c53b3b3d7a9fe645c3f066f1c9755ad79dc62b9616ff02fbc748" Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.931014 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fxcxg" event={"ID":"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea","Type":"ContainerStarted","Data":"8698e5a00919ce102aaa6ccbffe87cd39b0f1397878abdc02a13fd8be3c2a54c"} Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.933727 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-468e-account-create-update-pf2dw" event={"ID":"c0e668f2-e8a2-403b-af75-9ea042dad451","Type":"ContainerStarted","Data":"ae4d3cc997cd2a2fb40c7829d926da0f0a28c5e6aec8919e7ab1937a949d636b"} Feb 17 15:10:16 crc kubenswrapper[4717]: I0217 15:10:16.935565 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fc2mw" event={"ID":"c92c72e4-581d-4c2d-939b-7a7403142041","Type":"ContainerStarted","Data":"d8c75d7b87cd1ef5c0b5186198f05269382351e290640d8dc86e9ecc80ebdee6"} Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.270245 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-g7mg8"] Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.644511 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.656301 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.659267 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-n68sm" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.659652 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.659763 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.660345 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.692793 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.816752 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjvc6\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-kube-api-access-jjvc6\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.817126 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.817221 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" 
Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.817254 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/518c6b94-614f-42fd-9016-122cdcfcb8c9-lock\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.817286 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/518c6b94-614f-42fd-9016-122cdcfcb8c9-cache\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.817506 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518c6b94-614f-42fd-9016-122cdcfcb8c9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.919986 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.920111 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/518c6b94-614f-42fd-9016-122cdcfcb8c9-lock\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.920160 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/518c6b94-614f-42fd-9016-122cdcfcb8c9-cache\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.920274 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518c6b94-614f-42fd-9016-122cdcfcb8c9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.920348 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjvc6\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-kube-api-access-jjvc6\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.920412 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.920541 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.920712 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/518c6b94-614f-42fd-9016-122cdcfcb8c9-cache\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 
15:10:17 crc kubenswrapper[4717]: E0217 15:10:17.920827 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:10:17 crc kubenswrapper[4717]: E0217 15:10:17.920851 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 15:10:17 crc kubenswrapper[4717]: E0217 15:10:17.920891 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift podName:518c6b94-614f-42fd-9016-122cdcfcb8c9 nodeName:}" failed. No retries permitted until 2026-02-17 15:10:18.420875157 +0000 UTC m=+1084.836715633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift") pod "swift-storage-0" (UID: "518c6b94-614f-42fd-9016-122cdcfcb8c9") : configmap "swift-ring-files" not found Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.921309 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/518c6b94-614f-42fd-9016-122cdcfcb8c9-lock\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.927634 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518c6b94-614f-42fd-9016-122cdcfcb8c9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.936850 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjvc6\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-kube-api-access-jjvc6\") pod \"swift-storage-0\" (UID: 
\"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.943031 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.944681 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dfdd-account-create-update-gm7bt" event={"ID":"4fb161a2-7b80-400b-a4eb-ca14e4386721","Type":"ContainerStarted","Data":"4c52b97c2f1f802fa203b77aa3e96251c4bfb84f3f0c1f41b12f110bdbce3bbb"} Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.946553 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-g7mg8" event={"ID":"29c6b382-3315-4904-8fd9-dc7f1c993c2b","Type":"ContainerStarted","Data":"11038af40f1c6abf543922a6b192b7b794561349a6e33519d7c678ffcff325ab"} Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.949821 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fxcxg" event={"ID":"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea","Type":"ContainerStarted","Data":"5a2e52a7881ad868f34911f20aaa2de38d025d14a3689528ca22d9d37e1c0e1f"} Feb 17 15:10:17 crc kubenswrapper[4717]: I0217 15:10:17.970269 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-fc2mw" podStartSLOduration=3.970250381 podStartE2EDuration="3.970250381s" podCreationTimestamp="2026-02-17 15:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:17.968157342 +0000 UTC m=+1084.383997868" watchObservedRunningTime="2026-02-17 15:10:17.970250381 +0000 UTC m=+1084.386090867" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.150269 4717 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rhzj6"] Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.151410 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.153479 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.154222 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.154226 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.167157 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rhzj6"] Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.224518 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-dispersionconf\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.224559 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svjk8\" (UniqueName: \"kubernetes.io/projected/929feb4c-fd82-4293-9a44-a6f53816cdae-kube-api-access-svjk8\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.224587 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-ring-data-devices\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.224628 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-scripts\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.224648 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-combined-ca-bundle\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.224673 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/929feb4c-fd82-4293-9a44-a6f53816cdae-etc-swift\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.224707 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-swiftconf\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.326360 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-ring-data-devices\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.326439 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-scripts\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.326470 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-combined-ca-bundle\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.326497 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/929feb4c-fd82-4293-9a44-a6f53816cdae-etc-swift\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.326544 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-swiftconf\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.326639 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-dispersionconf\") pod 
\"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.326668 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svjk8\" (UniqueName: \"kubernetes.io/projected/929feb4c-fd82-4293-9a44-a6f53816cdae-kube-api-access-svjk8\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.327360 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-ring-data-devices\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.327555 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-scripts\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.327580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/929feb4c-fd82-4293-9a44-a6f53816cdae-etc-swift\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.330401 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-swiftconf\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" 
Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.331034 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-dispersionconf\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.331067 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-combined-ca-bundle\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.343908 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svjk8\" (UniqueName: \"kubernetes.io/projected/929feb4c-fd82-4293-9a44-a6f53816cdae-kube-api-access-svjk8\") pod \"swift-ring-rebalance-rhzj6\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.429119 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:18 crc kubenswrapper[4717]: E0217 15:10:18.429311 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:10:18 crc kubenswrapper[4717]: E0217 15:10:18.429688 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 15:10:18 crc kubenswrapper[4717]: E0217 15:10:18.429802 4717 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift podName:518c6b94-614f-42fd-9016-122cdcfcb8c9 nodeName:}" failed. No retries permitted until 2026-02-17 15:10:19.429786954 +0000 UTC m=+1085.845627430 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift") pod "swift-storage-0" (UID: "518c6b94-614f-42fd-9016-122cdcfcb8c9") : configmap "swift-ring-files" not found Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.531106 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.958039 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-468e-account-create-update-pf2dw" event={"ID":"c0e668f2-e8a2-403b-af75-9ea042dad451","Type":"ContainerStarted","Data":"17080c43a13839d51643cf8b58c0f9b6a44de0d4e29acd113c75e4b57e57a55b"} Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.961759 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-g7mg8" event={"ID":"29c6b382-3315-4904-8fd9-dc7f1c993c2b","Type":"ContainerStarted","Data":"7ec9aca08d8dd4a2a00430f320c52fee44835265cc02485a134fda7634f93c2e"} Feb 17 15:10:18 crc kubenswrapper[4717]: I0217 15:10:18.977279 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-468e-account-create-update-pf2dw" podStartSLOduration=3.977257689 podStartE2EDuration="3.977257689s" podCreationTimestamp="2026-02-17 15:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:18.971913677 +0000 UTC m=+1085.387754163" watchObservedRunningTime="2026-02-17 15:10:18.977257689 +0000 UTC m=+1085.393098165" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.024828 4717 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-tcpcx"] Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.026810 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-tcpcx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.035769 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-dfdd-account-create-update-gm7bt" podStartSLOduration=5.035752643 podStartE2EDuration="5.035752643s" podCreationTimestamp="2026-02-17 15:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:18.994532431 +0000 UTC m=+1085.410372917" watchObservedRunningTime="2026-02-17 15:10:19.035752643 +0000 UTC m=+1085.451593119" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.057776 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-fxcxg" podStartSLOduration=4.057757169 podStartE2EDuration="4.057757169s" podCreationTimestamp="2026-02-17 15:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:19.02474774 +0000 UTC m=+1085.440588226" watchObservedRunningTime="2026-02-17 15:10:19.057757169 +0000 UTC m=+1085.473597645" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.057964 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-tcpcx"] Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.070280 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rhzj6"] Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.114401 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e750-account-create-update-kv9rx"] Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.115766 4717 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-e750-account-create-update-kv9rx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.119611 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.123611 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e750-account-create-update-kv9rx"] Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.141742 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rtht\" (UniqueName: \"kubernetes.io/projected/5b3786e2-123b-4300-86fb-505483139531-kube-api-access-2rtht\") pod \"glance-db-create-tcpcx\" (UID: \"5b3786e2-123b-4300-86fb-505483139531\") " pod="openstack/glance-db-create-tcpcx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.141930 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3786e2-123b-4300-86fb-505483139531-operator-scripts\") pod \"glance-db-create-tcpcx\" (UID: \"5b3786e2-123b-4300-86fb-505483139531\") " pod="openstack/glance-db-create-tcpcx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.243436 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3786e2-123b-4300-86fb-505483139531-operator-scripts\") pod \"glance-db-create-tcpcx\" (UID: \"5b3786e2-123b-4300-86fb-505483139531\") " pod="openstack/glance-db-create-tcpcx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.243679 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-operator-scripts\") pod \"glance-e750-account-create-update-kv9rx\" (UID: 
\"9527cb55-cce6-4fb4-aa5c-b2495a6264f6\") " pod="openstack/glance-e750-account-create-update-kv9rx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.243988 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9mxw\" (UniqueName: \"kubernetes.io/projected/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-kube-api-access-l9mxw\") pod \"glance-e750-account-create-update-kv9rx\" (UID: \"9527cb55-cce6-4fb4-aa5c-b2495a6264f6\") " pod="openstack/glance-e750-account-create-update-kv9rx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.244046 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3786e2-123b-4300-86fb-505483139531-operator-scripts\") pod \"glance-db-create-tcpcx\" (UID: \"5b3786e2-123b-4300-86fb-505483139531\") " pod="openstack/glance-db-create-tcpcx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.244177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rtht\" (UniqueName: \"kubernetes.io/projected/5b3786e2-123b-4300-86fb-505483139531-kube-api-access-2rtht\") pod \"glance-db-create-tcpcx\" (UID: \"5b3786e2-123b-4300-86fb-505483139531\") " pod="openstack/glance-db-create-tcpcx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.264028 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rtht\" (UniqueName: \"kubernetes.io/projected/5b3786e2-123b-4300-86fb-505483139531-kube-api-access-2rtht\") pod \"glance-db-create-tcpcx\" (UID: \"5b3786e2-123b-4300-86fb-505483139531\") " pod="openstack/glance-db-create-tcpcx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.349199 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-tcpcx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.350530 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-operator-scripts\") pod \"glance-e750-account-create-update-kv9rx\" (UID: \"9527cb55-cce6-4fb4-aa5c-b2495a6264f6\") " pod="openstack/glance-e750-account-create-update-kv9rx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.350695 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9mxw\" (UniqueName: \"kubernetes.io/projected/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-kube-api-access-l9mxw\") pod \"glance-e750-account-create-update-kv9rx\" (UID: \"9527cb55-cce6-4fb4-aa5c-b2495a6264f6\") " pod="openstack/glance-e750-account-create-update-kv9rx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.351234 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-operator-scripts\") pod \"glance-e750-account-create-update-kv9rx\" (UID: \"9527cb55-cce6-4fb4-aa5c-b2495a6264f6\") " pod="openstack/glance-e750-account-create-update-kv9rx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.370103 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9mxw\" (UniqueName: \"kubernetes.io/projected/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-kube-api-access-l9mxw\") pod \"glance-e750-account-create-update-kv9rx\" (UID: \"9527cb55-cce6-4fb4-aa5c-b2495a6264f6\") " pod="openstack/glance-e750-account-create-update-kv9rx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.435035 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e750-account-create-update-kv9rx" Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.452258 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:19 crc kubenswrapper[4717]: E0217 15:10:19.452527 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:10:19 crc kubenswrapper[4717]: E0217 15:10:19.452556 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 15:10:19 crc kubenswrapper[4717]: E0217 15:10:19.452625 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift podName:518c6b94-614f-42fd-9016-122cdcfcb8c9 nodeName:}" failed. No retries permitted until 2026-02-17 15:10:21.452603931 +0000 UTC m=+1087.868444407 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift") pod "swift-storage-0" (UID: "518c6b94-614f-42fd-9016-122cdcfcb8c9") : configmap "swift-ring-files" not found Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.863355 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-tcpcx"] Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.954311 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e750-account-create-update-kv9rx"] Feb 17 15:10:19 crc kubenswrapper[4717]: W0217 15:10:19.957307 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9527cb55_cce6_4fb4_aa5c_b2495a6264f6.slice/crio-678cee376887c8fdc352fecf72b05fa372b52f7f5421f7c94c16b989b13941af WatchSource:0}: Error finding container 678cee376887c8fdc352fecf72b05fa372b52f7f5421f7c94c16b989b13941af: Status 404 returned error can't find the container with id 678cee376887c8fdc352fecf72b05fa372b52f7f5421f7c94c16b989b13941af Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.975948 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tcpcx" event={"ID":"5b3786e2-123b-4300-86fb-505483139531","Type":"ContainerStarted","Data":"2ab468ab633c9dccf0c1c273c22a02be6d2a14b51d33389b69a725dc59dec969"} Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.977180 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e750-account-create-update-kv9rx" event={"ID":"9527cb55-cce6-4fb4-aa5c-b2495a6264f6","Type":"ContainerStarted","Data":"678cee376887c8fdc352fecf72b05fa372b52f7f5421f7c94c16b989b13941af"} Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.978129 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rhzj6" 
event={"ID":"929feb4c-fd82-4293-9a44-a6f53816cdae","Type":"ContainerStarted","Data":"e885b1e0db5d414dce376939bf5865f09c34fa5aa362130297227f704db3ac75"} Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.979666 4717 generic.go:334] "Generic (PLEG): container finished" podID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerID="7ec9aca08d8dd4a2a00430f320c52fee44835265cc02485a134fda7634f93c2e" exitCode=0 Feb 17 15:10:19 crc kubenswrapper[4717]: I0217 15:10:19.983020 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-g7mg8" event={"ID":"29c6b382-3315-4904-8fd9-dc7f1c993c2b","Type":"ContainerDied","Data":"7ec9aca08d8dd4a2a00430f320c52fee44835265cc02485a134fda7634f93c2e"} Feb 17 15:10:20 crc kubenswrapper[4717]: I0217 15:10:20.993715 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tcpcx" event={"ID":"5b3786e2-123b-4300-86fb-505483139531","Type":"ContainerStarted","Data":"5ca72881a3fd4b3fd84db74172df8d5fdd228172442a767e1fa068eb71eae143"} Feb 17 15:10:20 crc kubenswrapper[4717]: I0217 15:10:20.995406 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e750-account-create-update-kv9rx" event={"ID":"9527cb55-cce6-4fb4-aa5c-b2495a6264f6","Type":"ContainerStarted","Data":"94173b96dd1944624e07d77926d4d98a3c750542042e65d34c958c10f0b69bf5"} Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.029178 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-h4rqv"] Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.030587 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h4rqv" Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.032877 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.037715 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6jvhf"] Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.045870 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6jvhf"] Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.052054 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h4rqv"] Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.192153 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32d66078-a470-4a86-92f9-b22752be660a-operator-scripts\") pod \"root-account-create-update-h4rqv\" (UID: \"32d66078-a470-4a86-92f9-b22752be660a\") " pod="openstack/root-account-create-update-h4rqv" Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.192262 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrs2\" (UniqueName: \"kubernetes.io/projected/32d66078-a470-4a86-92f9-b22752be660a-kube-api-access-cnrs2\") pod \"root-account-create-update-h4rqv\" (UID: \"32d66078-a470-4a86-92f9-b22752be660a\") " pod="openstack/root-account-create-update-h4rqv" Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.293832 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32d66078-a470-4a86-92f9-b22752be660a-operator-scripts\") pod \"root-account-create-update-h4rqv\" (UID: \"32d66078-a470-4a86-92f9-b22752be660a\") " 
pod="openstack/root-account-create-update-h4rqv" Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.294132 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrs2\" (UniqueName: \"kubernetes.io/projected/32d66078-a470-4a86-92f9-b22752be660a-kube-api-access-cnrs2\") pod \"root-account-create-update-h4rqv\" (UID: \"32d66078-a470-4a86-92f9-b22752be660a\") " pod="openstack/root-account-create-update-h4rqv" Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.294848 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32d66078-a470-4a86-92f9-b22752be660a-operator-scripts\") pod \"root-account-create-update-h4rqv\" (UID: \"32d66078-a470-4a86-92f9-b22752be660a\") " pod="openstack/root-account-create-update-h4rqv" Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.330875 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnrs2\" (UniqueName: \"kubernetes.io/projected/32d66078-a470-4a86-92f9-b22752be660a-kube-api-access-cnrs2\") pod \"root-account-create-update-h4rqv\" (UID: \"32d66078-a470-4a86-92f9-b22752be660a\") " pod="openstack/root-account-create-update-h4rqv" Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.409554 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h4rqv" Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.503477 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:21 crc kubenswrapper[4717]: E0217 15:10:21.505269 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:10:21 crc kubenswrapper[4717]: E0217 15:10:21.505325 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 15:10:21 crc kubenswrapper[4717]: E0217 15:10:21.505409 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift podName:518c6b94-614f-42fd-9016-122cdcfcb8c9 nodeName:}" failed. No retries permitted until 2026-02-17 15:10:25.50538138 +0000 UTC m=+1091.921221896 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift") pod "swift-storage-0" (UID: "518c6b94-614f-42fd-9016-122cdcfcb8c9") : configmap "swift-ring-files" not found Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.857840 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad94a78f-0c92-4603-863d-f4c50c946028" path="/var/lib/kubelet/pods/ad94a78f-0c92-4603-863d-f4c50c946028/volumes" Feb 17 15:10:21 crc kubenswrapper[4717]: I0217 15:10:21.928707 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h4rqv"] Feb 17 15:10:21 crc kubenswrapper[4717]: W0217 15:10:21.933292 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32d66078_a470_4a86_92f9_b22752be660a.slice/crio-0553b1138111beb9e229291907715a2966044eb335a15f5023435b533bccf0e2 WatchSource:0}: Error finding container 0553b1138111beb9e229291907715a2966044eb335a15f5023435b533bccf0e2: Status 404 returned error can't find the container with id 0553b1138111beb9e229291907715a2966044eb335a15f5023435b533bccf0e2 Feb 17 15:10:22 crc kubenswrapper[4717]: I0217 15:10:22.006267 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h4rqv" event={"ID":"32d66078-a470-4a86-92f9-b22752be660a","Type":"ContainerStarted","Data":"0553b1138111beb9e229291907715a2966044eb335a15f5023435b533bccf0e2"} Feb 17 15:10:22 crc kubenswrapper[4717]: I0217 15:10:22.010047 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-g7mg8" event={"ID":"29c6b382-3315-4904-8fd9-dc7f1c993c2b","Type":"ContainerStarted","Data":"15cc8d81e9ea799a7d93cb54ecc804873c28505e89516613adffc06a18ed205f"} Feb 17 15:10:22 crc kubenswrapper[4717]: I0217 15:10:22.010970 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:22 crc kubenswrapper[4717]: I0217 15:10:22.025400 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-tcpcx" podStartSLOduration=4.025381493 podStartE2EDuration="4.025381493s" podCreationTimestamp="2026-02-17 15:10:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:22.023552061 +0000 UTC m=+1088.439392557" watchObservedRunningTime="2026-02-17 15:10:22.025381493 +0000 UTC m=+1088.441221969" Feb 17 15:10:22 crc kubenswrapper[4717]: I0217 15:10:22.044246 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-e750-account-create-update-kv9rx" podStartSLOduration=3.044229079 podStartE2EDuration="3.044229079s" podCreationTimestamp="2026-02-17 15:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:22.036065617 +0000 UTC m=+1088.451906093" watchObservedRunningTime="2026-02-17 15:10:22.044229079 +0000 UTC m=+1088.460069555" Feb 17 15:10:22 crc kubenswrapper[4717]: I0217 15:10:22.057580 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-g7mg8" podStartSLOduration=6.057562119 podStartE2EDuration="6.057562119s" podCreationTimestamp="2026-02-17 15:10:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:22.052929217 +0000 UTC m=+1088.468769693" watchObservedRunningTime="2026-02-17 15:10:22.057562119 +0000 UTC m=+1088.473402585" Feb 17 15:10:23 crc kubenswrapper[4717]: I0217 15:10:23.022680 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h4rqv" 
event={"ID":"32d66078-a470-4a86-92f9-b22752be660a","Type":"ContainerStarted","Data":"35d412548ce2232ec5c975027c9be64cea9e54fa413ebf26c9ba7a6b2ad5358f"} Feb 17 15:10:23 crc kubenswrapper[4717]: I0217 15:10:23.043290 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-h4rqv" podStartSLOduration=2.04326764 podStartE2EDuration="2.04326764s" podCreationTimestamp="2026-02-17 15:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:23.039488352 +0000 UTC m=+1089.455328848" watchObservedRunningTime="2026-02-17 15:10:23.04326764 +0000 UTC m=+1089.459108116" Feb 17 15:10:24 crc kubenswrapper[4717]: I0217 15:10:24.034792 4717 generic.go:334] "Generic (PLEG): container finished" podID="c92c72e4-581d-4c2d-939b-7a7403142041" containerID="d8c75d7b87cd1ef5c0b5186198f05269382351e290640d8dc86e9ecc80ebdee6" exitCode=0 Feb 17 15:10:24 crc kubenswrapper[4717]: I0217 15:10:24.034897 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fc2mw" event={"ID":"c92c72e4-581d-4c2d-939b-7a7403142041","Type":"ContainerDied","Data":"d8c75d7b87cd1ef5c0b5186198f05269382351e290640d8dc86e9ecc80ebdee6"} Feb 17 15:10:25 crc kubenswrapper[4717]: I0217 15:10:25.483192 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fc2mw" Feb 17 15:10:25 crc kubenswrapper[4717]: I0217 15:10:25.596127 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p99qp\" (UniqueName: \"kubernetes.io/projected/c92c72e4-581d-4c2d-939b-7a7403142041-kube-api-access-p99qp\") pod \"c92c72e4-581d-4c2d-939b-7a7403142041\" (UID: \"c92c72e4-581d-4c2d-939b-7a7403142041\") " Feb 17 15:10:25 crc kubenswrapper[4717]: I0217 15:10:25.596572 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92c72e4-581d-4c2d-939b-7a7403142041-operator-scripts\") pod \"c92c72e4-581d-4c2d-939b-7a7403142041\" (UID: \"c92c72e4-581d-4c2d-939b-7a7403142041\") " Feb 17 15:10:25 crc kubenswrapper[4717]: I0217 15:10:25.596908 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:25 crc kubenswrapper[4717]: E0217 15:10:25.597141 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:10:25 crc kubenswrapper[4717]: E0217 15:10:25.597160 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 15:10:25 crc kubenswrapper[4717]: E0217 15:10:25.597208 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift podName:518c6b94-614f-42fd-9016-122cdcfcb8c9 nodeName:}" failed. No retries permitted until 2026-02-17 15:10:33.597190505 +0000 UTC m=+1100.013030981 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift") pod "swift-storage-0" (UID: "518c6b94-614f-42fd-9016-122cdcfcb8c9") : configmap "swift-ring-files" not found Feb 17 15:10:25 crc kubenswrapper[4717]: I0217 15:10:25.599178 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c92c72e4-581d-4c2d-939b-7a7403142041-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c92c72e4-581d-4c2d-939b-7a7403142041" (UID: "c92c72e4-581d-4c2d-939b-7a7403142041"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:25 crc kubenswrapper[4717]: I0217 15:10:25.607495 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92c72e4-581d-4c2d-939b-7a7403142041-kube-api-access-p99qp" (OuterVolumeSpecName: "kube-api-access-p99qp") pod "c92c72e4-581d-4c2d-939b-7a7403142041" (UID: "c92c72e4-581d-4c2d-939b-7a7403142041"). InnerVolumeSpecName "kube-api-access-p99qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:25 crc kubenswrapper[4717]: I0217 15:10:25.699212 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p99qp\" (UniqueName: \"kubernetes.io/projected/c92c72e4-581d-4c2d-939b-7a7403142041-kube-api-access-p99qp\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:25 crc kubenswrapper[4717]: I0217 15:10:25.699251 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92c72e4-581d-4c2d-939b-7a7403142041-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:26 crc kubenswrapper[4717]: I0217 15:10:26.060794 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fc2mw" event={"ID":"c92c72e4-581d-4c2d-939b-7a7403142041","Type":"ContainerDied","Data":"74ca1cb943e120db201565e99819b51637c4835605225a23a50475f549ce79e8"} Feb 17 15:10:26 crc kubenswrapper[4717]: I0217 15:10:26.060855 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74ca1cb943e120db201565e99819b51637c4835605225a23a50475f549ce79e8" Feb 17 15:10:26 crc kubenswrapper[4717]: I0217 15:10:26.060880 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fc2mw" Feb 17 15:10:26 crc kubenswrapper[4717]: I0217 15:10:26.814324 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:10:26 crc kubenswrapper[4717]: I0217 15:10:26.898056 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z55jw"] Feb 17 15:10:26 crc kubenswrapper[4717]: I0217 15:10:26.898341 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" podUID="d3f84dac-d50e-4366-921b-e10d201ce421" containerName="dnsmasq-dns" containerID="cri-o://85eacc059669c5b39c9c461429b4070c4c5bafba58fbb5dc845db03726385662" gracePeriod=10 Feb 17 15:10:28 crc kubenswrapper[4717]: I0217 15:10:28.079807 4717 generic.go:334] "Generic (PLEG): container finished" podID="d3f84dac-d50e-4366-921b-e10d201ce421" containerID="85eacc059669c5b39c9c461429b4070c4c5bafba58fbb5dc845db03726385662" exitCode=0 Feb 17 15:10:28 crc kubenswrapper[4717]: I0217 15:10:28.080023 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" event={"ID":"d3f84dac-d50e-4366-921b-e10d201ce421","Type":"ContainerDied","Data":"85eacc059669c5b39c9c461429b4070c4c5bafba58fbb5dc845db03726385662"} Feb 17 15:10:28 crc kubenswrapper[4717]: I0217 15:10:28.551607 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 17 15:10:29 crc kubenswrapper[4717]: I0217 15:10:29.089113 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" podUID="d3f84dac-d50e-4366-921b-e10d201ce421" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused" Feb 17 15:10:30 crc kubenswrapper[4717]: I0217 15:10:30.096957 4717 generic.go:334] "Generic (PLEG): container finished" podID="76fdb52f-cf38-4b2d-8997-ebcb0395e6ea" 
containerID="5a2e52a7881ad868f34911f20aaa2de38d025d14a3689528ca22d9d37e1c0e1f" exitCode=0 Feb 17 15:10:30 crc kubenswrapper[4717]: I0217 15:10:30.096993 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fxcxg" event={"ID":"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea","Type":"ContainerDied","Data":"5a2e52a7881ad868f34911f20aaa2de38d025d14a3689528ca22d9d37e1c0e1f"} Feb 17 15:10:33 crc kubenswrapper[4717]: I0217 15:10:33.646004 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:33 crc kubenswrapper[4717]: E0217 15:10:33.646263 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:10:33 crc kubenswrapper[4717]: E0217 15:10:33.646689 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 15:10:33 crc kubenswrapper[4717]: E0217 15:10:33.646748 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift podName:518c6b94-614f-42fd-9016-122cdcfcb8c9 nodeName:}" failed. No retries permitted until 2026-02-17 15:10:49.646733332 +0000 UTC m=+1116.062573808 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift") pod "swift-storage-0" (UID: "518c6b94-614f-42fd-9016-122cdcfcb8c9") : configmap "swift-ring-files" not found Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.265759 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-k4zc7" podUID="edd6eb53-55b7-4a61-867c-e4bf277af963" containerName="ovn-controller" probeResult="failure" output=< Feb 17 15:10:35 crc kubenswrapper[4717]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 15:10:35 crc kubenswrapper[4717]: > Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.270094 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.296539 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5r2q4" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.453287 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.458441 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-fxcxg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.558394 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-k4zc7-config-26vvg"] Feb 17 15:10:35 crc kubenswrapper[4717]: E0217 15:10:35.558808 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f84dac-d50e-4366-921b-e10d201ce421" containerName="dnsmasq-dns" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.558828 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f84dac-d50e-4366-921b-e10d201ce421" containerName="dnsmasq-dns" Feb 17 15:10:35 crc kubenswrapper[4717]: E0217 15:10:35.558853 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c92c72e4-581d-4c2d-939b-7a7403142041" containerName="mariadb-database-create" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.558863 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92c72e4-581d-4c2d-939b-7a7403142041" containerName="mariadb-database-create" Feb 17 15:10:35 crc kubenswrapper[4717]: E0217 15:10:35.558886 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f84dac-d50e-4366-921b-e10d201ce421" containerName="init" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.558893 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f84dac-d50e-4366-921b-e10d201ce421" containerName="init" Feb 17 15:10:35 crc kubenswrapper[4717]: E0217 15:10:35.558925 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fdb52f-cf38-4b2d-8997-ebcb0395e6ea" containerName="mariadb-database-create" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.558932 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fdb52f-cf38-4b2d-8997-ebcb0395e6ea" containerName="mariadb-database-create" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.559125 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f84dac-d50e-4366-921b-e10d201ce421" 
containerName="dnsmasq-dns" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.559153 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fdb52f-cf38-4b2d-8997-ebcb0395e6ea" containerName="mariadb-database-create" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.559165 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c92c72e4-581d-4c2d-939b-7a7403142041" containerName="mariadb-database-create" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.559784 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.564184 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.570060 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k4zc7-config-26vvg"] Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.588053 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-dns-svc\") pod \"d3f84dac-d50e-4366-921b-e10d201ce421\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.588133 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-operator-scripts\") pod \"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea\" (UID: \"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea\") " Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.588189 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsm8v\" (UniqueName: \"kubernetes.io/projected/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-kube-api-access-bsm8v\") pod \"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea\" 
(UID: \"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea\") " Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.588215 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z4st\" (UniqueName: \"kubernetes.io/projected/d3f84dac-d50e-4366-921b-e10d201ce421-kube-api-access-6z4st\") pod \"d3f84dac-d50e-4366-921b-e10d201ce421\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.588279 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-sb\") pod \"d3f84dac-d50e-4366-921b-e10d201ce421\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.588301 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-nb\") pod \"d3f84dac-d50e-4366-921b-e10d201ce421\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.588325 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-config\") pod \"d3f84dac-d50e-4366-921b-e10d201ce421\" (UID: \"d3f84dac-d50e-4366-921b-e10d201ce421\") " Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.602490 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-kube-api-access-bsm8v" (OuterVolumeSpecName: "kube-api-access-bsm8v") pod "76fdb52f-cf38-4b2d-8997-ebcb0395e6ea" (UID: "76fdb52f-cf38-4b2d-8997-ebcb0395e6ea"). InnerVolumeSpecName "kube-api-access-bsm8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.606345 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76fdb52f-cf38-4b2d-8997-ebcb0395e6ea" (UID: "76fdb52f-cf38-4b2d-8997-ebcb0395e6ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.617020 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f84dac-d50e-4366-921b-e10d201ce421-kube-api-access-6z4st" (OuterVolumeSpecName: "kube-api-access-6z4st") pod "d3f84dac-d50e-4366-921b-e10d201ce421" (UID: "d3f84dac-d50e-4366-921b-e10d201ce421"). InnerVolumeSpecName "kube-api-access-6z4st". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.649659 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d3f84dac-d50e-4366-921b-e10d201ce421" (UID: "d3f84dac-d50e-4366-921b-e10d201ce421"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.659388 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d3f84dac-d50e-4366-921b-e10d201ce421" (UID: "d3f84dac-d50e-4366-921b-e10d201ce421"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.675818 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-config" (OuterVolumeSpecName: "config") pod "d3f84dac-d50e-4366-921b-e10d201ce421" (UID: "d3f84dac-d50e-4366-921b-e10d201ce421"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.681452 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d3f84dac-d50e-4366-921b-e10d201ce421" (UID: "d3f84dac-d50e-4366-921b-e10d201ce421"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691316 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691379 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run-ovn\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691405 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-log-ovn\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691444 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-scripts\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691541 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjzh5\" (UniqueName: \"kubernetes.io/projected/04a4229e-485b-43ed-80aa-a9f3212a132c-kube-api-access-gjzh5\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691597 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-additional-scripts\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691678 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691692 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691705 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsm8v\" (UniqueName: \"kubernetes.io/projected/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea-kube-api-access-bsm8v\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691718 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z4st\" (UniqueName: \"kubernetes.io/projected/d3f84dac-d50e-4366-921b-e10d201ce421-kube-api-access-6z4st\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691730 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691740 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.691750 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f84dac-d50e-4366-921b-e10d201ce421-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.793687 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.794019 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run-ovn\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.794037 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-log-ovn\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.793976 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.794133 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-scripts\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.794203 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-log-ovn\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.794229 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run-ovn\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.796599 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-scripts\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.796650 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjzh5\" (UniqueName: \"kubernetes.io/projected/04a4229e-485b-43ed-80aa-a9f3212a132c-kube-api-access-gjzh5\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.796713 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-additional-scripts\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.797251 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-additional-scripts\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.815307 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjzh5\" (UniqueName: 
\"kubernetes.io/projected/04a4229e-485b-43ed-80aa-a9f3212a132c-kube-api-access-gjzh5\") pod \"ovn-controller-k4zc7-config-26vvg\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:35 crc kubenswrapper[4717]: I0217 15:10:35.907424 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.144207 4717 generic.go:334] "Generic (PLEG): container finished" podID="8924cebf-3c79-4978-9564-ec8869b9d79a" containerID="9d4afd969ca60da733acf3772d99a5fb8c4614efb5bf99795644d5b1b294843f" exitCode=0 Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.144270 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8924cebf-3c79-4978-9564-ec8869b9d79a","Type":"ContainerDied","Data":"9d4afd969ca60da733acf3772d99a5fb8c4614efb5bf99795644d5b1b294843f"} Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.149714 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fxcxg" event={"ID":"76fdb52f-cf38-4b2d-8997-ebcb0395e6ea","Type":"ContainerDied","Data":"8698e5a00919ce102aaa6ccbffe87cd39b0f1397878abdc02a13fd8be3c2a54c"} Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.149752 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8698e5a00919ce102aaa6ccbffe87cd39b0f1397878abdc02a13fd8be3c2a54c" Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.149816 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-fxcxg" Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.158286 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.158869 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" event={"ID":"d3f84dac-d50e-4366-921b-e10d201ce421","Type":"ContainerDied","Data":"8ddec66381fc42ae1a94fea56dbd45835719f745c7f97d166406ab6c470fed70"} Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.158935 4717 scope.go:117] "RemoveContainer" containerID="85eacc059669c5b39c9c461429b4070c4c5bafba58fbb5dc845db03726385662" Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.161461 4717 generic.go:334] "Generic (PLEG): container finished" podID="0eb38f44-bed1-4e65-8de2-9624715baee1" containerID="9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b" exitCode=0 Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.161544 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0eb38f44-bed1-4e65-8de2-9624715baee1","Type":"ContainerDied","Data":"9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b"} Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.193358 4717 scope.go:117] "RemoveContainer" containerID="3903b21e269cd99d14a9e15f3f9fc79b7c9aef40152a2d39ca1aff3171d678dd" Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.226205 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z55jw"] Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.237244 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z55jw"] Feb 17 15:10:36 crc kubenswrapper[4717]: E0217 15:10:36.347591 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified" Feb 17 15:10:36 crc kubenswrapper[4717]: E0217 15:10:36.347937 4717 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:swift-ring-rebalance,Image:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,Command:[/usr/local/bin/swift-ring-tool all],Args:[],WorkingDir:/etc/swift,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CM_NAME,Value:swift-ring-files,ValueFrom:nil,},EnvVar{Name:NAMESPACE,Value:openstack,ValueFrom:nil,},EnvVar{Name:OWNER_APIVERSION,Value:swift.openstack.org/v1beta1,ValueFrom:nil,},EnvVar{Name:OWNER_KIND,Value:SwiftRing,ValueFrom:nil,},EnvVar{Name:OWNER_NAME,Value:swift-ring,ValueFrom:nil,},EnvVar{Name:OWNER_UID,Value:5f8fe17d-0a86-4de8-8e0b-9d407672af02,ValueFrom:nil,},EnvVar{Name:SWIFT_MIN_PART_HOURS,Value:1,ValueFrom:nil,},EnvVar{Name:SWIFT_PART_POWER,Value:10,ValueFrom:nil,},EnvVar{Name:SWIFT_REPLICAS,Value:1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/swift-ring-tool,SubPath:swift-ring-tool,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:swiftconf,ReadOnly:true,MountPath:/etc/swift/swift.conf,SubPath:swift.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-swift,ReadOnly:false,MountPath:/etc/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ring-data-devices,ReadOnly:true,MountPath:/var/lib/config-data/ring-devices,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dispersionconf,ReadOnly:true,MountPath:/etc/swift/dispersion.conf,SubPath:dispersion.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-svjk8,ReadOnly:true,MountPath:
/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42445,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-ring-rebalance-rhzj6_openstack(929feb4c-fd82-4293-9a44-a6f53816cdae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:10:36 crc kubenswrapper[4717]: I0217 15:10:36.348265 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k4zc7-config-26vvg"] Feb 17 15:10:36 crc kubenswrapper[4717]: E0217 15:10:36.349285 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"swift-ring-rebalance\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/swift-ring-rebalance-rhzj6" podUID="929feb4c-fd82-4293-9a44-a6f53816cdae" Feb 17 15:10:36 crc kubenswrapper[4717]: W0217 15:10:36.356327 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04a4229e_485b_43ed_80aa_a9f3212a132c.slice/crio-833b95a744cb93a578fd2c2ad0a3574f2063beab3f656fe5e939cd6706a5d499 WatchSource:0}: Error finding container 833b95a744cb93a578fd2c2ad0a3574f2063beab3f656fe5e939cd6706a5d499: Status 404 returned error can't find the container with id 
833b95a744cb93a578fd2c2ad0a3574f2063beab3f656fe5e939cd6706a5d499 Feb 17 15:10:37 crc kubenswrapper[4717]: I0217 15:10:37.169504 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k4zc7-config-26vvg" event={"ID":"04a4229e-485b-43ed-80aa-a9f3212a132c","Type":"ContainerStarted","Data":"a580cee00e166254db5557e0ef7b3d2e3f6757cb5754a1391c64c4a5c7dbe9a4"} Feb 17 15:10:37 crc kubenswrapper[4717]: I0217 15:10:37.169543 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k4zc7-config-26vvg" event={"ID":"04a4229e-485b-43ed-80aa-a9f3212a132c","Type":"ContainerStarted","Data":"833b95a744cb93a578fd2c2ad0a3574f2063beab3f656fe5e939cd6706a5d499"} Feb 17 15:10:37 crc kubenswrapper[4717]: I0217 15:10:37.171720 4717 generic.go:334] "Generic (PLEG): container finished" podID="5b3786e2-123b-4300-86fb-505483139531" containerID="5ca72881a3fd4b3fd84db74172df8d5fdd228172442a767e1fa068eb71eae143" exitCode=0 Feb 17 15:10:37 crc kubenswrapper[4717]: I0217 15:10:37.171769 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tcpcx" event={"ID":"5b3786e2-123b-4300-86fb-505483139531","Type":"ContainerDied","Data":"5ca72881a3fd4b3fd84db74172df8d5fdd228172442a767e1fa068eb71eae143"} Feb 17 15:10:37 crc kubenswrapper[4717]: I0217 15:10:37.175160 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0eb38f44-bed1-4e65-8de2-9624715baee1","Type":"ContainerStarted","Data":"2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612"} Feb 17 15:10:37 crc kubenswrapper[4717]: I0217 15:10:37.175362 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:10:37 crc kubenswrapper[4717]: I0217 15:10:37.177670 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"8924cebf-3c79-4978-9564-ec8869b9d79a","Type":"ContainerStarted","Data":"2c1aa0b270a00ffe95fae98442ccacfdc668667202ea12c65f80fb8f3530f8f9"} Feb 17 15:10:37 crc kubenswrapper[4717]: I0217 15:10:37.178248 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 15:10:37 crc kubenswrapper[4717]: E0217 15:10:37.179148 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"swift-ring-rebalance\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified\\\"\"" pod="openstack/swift-ring-rebalance-rhzj6" podUID="929feb4c-fd82-4293-9a44-a6f53816cdae" Feb 17 15:10:37 crc kubenswrapper[4717]: I0217 15:10:37.216227 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.710235597 podStartE2EDuration="1m8.216212676s" podCreationTimestamp="2026-02-17 15:09:29 +0000 UTC" firstStartedPulling="2026-02-17 15:09:31.804378626 +0000 UTC m=+1038.220219102" lastFinishedPulling="2026-02-17 15:10:01.310355705 +0000 UTC m=+1067.726196181" observedRunningTime="2026-02-17 15:10:37.207777106 +0000 UTC m=+1103.623617602" watchObservedRunningTime="2026-02-17 15:10:37.216212676 +0000 UTC m=+1103.632053152" Feb 17 15:10:37 crc kubenswrapper[4717]: I0217 15:10:37.231901 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.089320294 podStartE2EDuration="1m8.231881033s" podCreationTimestamp="2026-02-17 15:09:29 +0000 UTC" firstStartedPulling="2026-02-17 15:09:31.542732492 +0000 UTC m=+1037.958572968" lastFinishedPulling="2026-02-17 15:10:01.685293231 +0000 UTC m=+1068.101133707" observedRunningTime="2026-02-17 15:10:37.229846715 +0000 UTC m=+1103.645687211" watchObservedRunningTime="2026-02-17 15:10:37.231881033 +0000 UTC m=+1103.647721509" Feb 17 15:10:37 crc 
kubenswrapper[4717]: I0217 15:10:37.857269 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f84dac-d50e-4366-921b-e10d201ce421" path="/var/lib/kubelet/pods/d3f84dac-d50e-4366-921b-e10d201ce421/volumes" Feb 17 15:10:38 crc kubenswrapper[4717]: I0217 15:10:38.187414 4717 generic.go:334] "Generic (PLEG): container finished" podID="04a4229e-485b-43ed-80aa-a9f3212a132c" containerID="a580cee00e166254db5557e0ef7b3d2e3f6757cb5754a1391c64c4a5c7dbe9a4" exitCode=0 Feb 17 15:10:38 crc kubenswrapper[4717]: I0217 15:10:38.189245 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k4zc7-config-26vvg" event={"ID":"04a4229e-485b-43ed-80aa-a9f3212a132c","Type":"ContainerDied","Data":"a580cee00e166254db5557e0ef7b3d2e3f6757cb5754a1391c64c4a5c7dbe9a4"} Feb 17 15:10:38 crc kubenswrapper[4717]: I0217 15:10:38.576422 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-tcpcx" Feb 17 15:10:38 crc kubenswrapper[4717]: I0217 15:10:38.650617 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rtht\" (UniqueName: \"kubernetes.io/projected/5b3786e2-123b-4300-86fb-505483139531-kube-api-access-2rtht\") pod \"5b3786e2-123b-4300-86fb-505483139531\" (UID: \"5b3786e2-123b-4300-86fb-505483139531\") " Feb 17 15:10:38 crc kubenswrapper[4717]: I0217 15:10:38.650852 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3786e2-123b-4300-86fb-505483139531-operator-scripts\") pod \"5b3786e2-123b-4300-86fb-505483139531\" (UID: \"5b3786e2-123b-4300-86fb-505483139531\") " Feb 17 15:10:38 crc kubenswrapper[4717]: I0217 15:10:38.652196 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3786e2-123b-4300-86fb-505483139531-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"5b3786e2-123b-4300-86fb-505483139531" (UID: "5b3786e2-123b-4300-86fb-505483139531"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:38 crc kubenswrapper[4717]: I0217 15:10:38.657324 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3786e2-123b-4300-86fb-505483139531-kube-api-access-2rtht" (OuterVolumeSpecName: "kube-api-access-2rtht") pod "5b3786e2-123b-4300-86fb-505483139531" (UID: "5b3786e2-123b-4300-86fb-505483139531"). InnerVolumeSpecName "kube-api-access-2rtht". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:38 crc kubenswrapper[4717]: I0217 15:10:38.752860 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rtht\" (UniqueName: \"kubernetes.io/projected/5b3786e2-123b-4300-86fb-505483139531-kube-api-access-2rtht\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:38 crc kubenswrapper[4717]: I0217 15:10:38.752908 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3786e2-123b-4300-86fb-505483139531-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.090064 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-z55jw" podUID="d3f84dac-d50e-4366-921b-e10d201ce421" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.199255 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-tcpcx" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.210302 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tcpcx" event={"ID":"5b3786e2-123b-4300-86fb-505483139531","Type":"ContainerDied","Data":"2ab468ab633c9dccf0c1c273c22a02be6d2a14b51d33389b69a725dc59dec969"} Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.210417 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ab468ab633c9dccf0c1c273c22a02be6d2a14b51d33389b69a725dc59dec969" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.606921 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.671172 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-log-ovn\") pod \"04a4229e-485b-43ed-80aa-a9f3212a132c\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.671261 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run\") pod \"04a4229e-485b-43ed-80aa-a9f3212a132c\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.671320 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "04a4229e-485b-43ed-80aa-a9f3212a132c" (UID: "04a4229e-485b-43ed-80aa-a9f3212a132c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.671375 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-additional-scripts\") pod \"04a4229e-485b-43ed-80aa-a9f3212a132c\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.671413 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-scripts\") pod \"04a4229e-485b-43ed-80aa-a9f3212a132c\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.671440 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run-ovn\") pod \"04a4229e-485b-43ed-80aa-a9f3212a132c\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.671453 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run" (OuterVolumeSpecName: "var-run") pod "04a4229e-485b-43ed-80aa-a9f3212a132c" (UID: "04a4229e-485b-43ed-80aa-a9f3212a132c"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.671473 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjzh5\" (UniqueName: \"kubernetes.io/projected/04a4229e-485b-43ed-80aa-a9f3212a132c-kube-api-access-gjzh5\") pod \"04a4229e-485b-43ed-80aa-a9f3212a132c\" (UID: \"04a4229e-485b-43ed-80aa-a9f3212a132c\") " Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.671928 4717 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.671944 4717 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.672561 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "04a4229e-485b-43ed-80aa-a9f3212a132c" (UID: "04a4229e-485b-43ed-80aa-a9f3212a132c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.672704 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "04a4229e-485b-43ed-80aa-a9f3212a132c" (UID: "04a4229e-485b-43ed-80aa-a9f3212a132c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.672784 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-scripts" (OuterVolumeSpecName: "scripts") pod "04a4229e-485b-43ed-80aa-a9f3212a132c" (UID: "04a4229e-485b-43ed-80aa-a9f3212a132c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.675192 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a4229e-485b-43ed-80aa-a9f3212a132c-kube-api-access-gjzh5" (OuterVolumeSpecName: "kube-api-access-gjzh5") pod "04a4229e-485b-43ed-80aa-a9f3212a132c" (UID: "04a4229e-485b-43ed-80aa-a9f3212a132c"). InnerVolumeSpecName "kube-api-access-gjzh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.773339 4717 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.773368 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a4229e-485b-43ed-80aa-a9f3212a132c-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.773377 4717 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04a4229e-485b-43ed-80aa-a9f3212a132c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:39 crc kubenswrapper[4717]: I0217 15:10:39.773387 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjzh5\" (UniqueName: \"kubernetes.io/projected/04a4229e-485b-43ed-80aa-a9f3212a132c-kube-api-access-gjzh5\") on node \"crc\" 
DevicePath \"\"" Feb 17 15:10:40 crc kubenswrapper[4717]: I0217 15:10:40.212726 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k4zc7-config-26vvg" event={"ID":"04a4229e-485b-43ed-80aa-a9f3212a132c","Type":"ContainerDied","Data":"833b95a744cb93a578fd2c2ad0a3574f2063beab3f656fe5e939cd6706a5d499"} Feb 17 15:10:40 crc kubenswrapper[4717]: I0217 15:10:40.212764 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="833b95a744cb93a578fd2c2ad0a3574f2063beab3f656fe5e939cd6706a5d499" Feb 17 15:10:40 crc kubenswrapper[4717]: I0217 15:10:40.212815 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k4zc7-config-26vvg" Feb 17 15:10:40 crc kubenswrapper[4717]: I0217 15:10:40.249867 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-k4zc7" Feb 17 15:10:40 crc kubenswrapper[4717]: I0217 15:10:40.703098 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-k4zc7-config-26vvg"] Feb 17 15:10:40 crc kubenswrapper[4717]: I0217 15:10:40.710470 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-k4zc7-config-26vvg"] Feb 17 15:10:41 crc kubenswrapper[4717]: I0217 15:10:41.873357 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a4229e-485b-43ed-80aa-a9f3212a132c" path="/var/lib/kubelet/pods/04a4229e-485b-43ed-80aa-a9f3212a132c/volumes" Feb 17 15:10:42 crc kubenswrapper[4717]: E0217 15:10:42.947376 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0e668f2_e8a2_403b_af75_9ea042dad451.slice/crio-17080c43a13839d51643cf8b58c0f9b6a44de0d4e29acd113c75e4b57e57a55b.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32d66078_a470_4a86_92f9_b22752be660a.slice/crio-conmon-35d412548ce2232ec5c975027c9be64cea9e54fa413ebf26c9ba7a6b2ad5358f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0e668f2_e8a2_403b_af75_9ea042dad451.slice/crio-conmon-17080c43a13839d51643cf8b58c0f9b6a44de0d4e29acd113c75e4b57e57a55b.scope\": RecentStats: unable to find data in memory cache]" Feb 17 15:10:43 crc kubenswrapper[4717]: I0217 15:10:43.238226 4717 generic.go:334] "Generic (PLEG): container finished" podID="9527cb55-cce6-4fb4-aa5c-b2495a6264f6" containerID="94173b96dd1944624e07d77926d4d98a3c750542042e65d34c958c10f0b69bf5" exitCode=0 Feb 17 15:10:43 crc kubenswrapper[4717]: I0217 15:10:43.238330 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e750-account-create-update-kv9rx" event={"ID":"9527cb55-cce6-4fb4-aa5c-b2495a6264f6","Type":"ContainerDied","Data":"94173b96dd1944624e07d77926d4d98a3c750542042e65d34c958c10f0b69bf5"} Feb 17 15:10:43 crc kubenswrapper[4717]: I0217 15:10:43.240014 4717 generic.go:334] "Generic (PLEG): container finished" podID="c0e668f2-e8a2-403b-af75-9ea042dad451" containerID="17080c43a13839d51643cf8b58c0f9b6a44de0d4e29acd113c75e4b57e57a55b" exitCode=0 Feb 17 15:10:43 crc kubenswrapper[4717]: I0217 15:10:43.240149 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-468e-account-create-update-pf2dw" event={"ID":"c0e668f2-e8a2-403b-af75-9ea042dad451","Type":"ContainerDied","Data":"17080c43a13839d51643cf8b58c0f9b6a44de0d4e29acd113c75e4b57e57a55b"} Feb 17 15:10:43 crc kubenswrapper[4717]: I0217 15:10:43.241946 4717 generic.go:334] "Generic (PLEG): container finished" podID="32d66078-a470-4a86-92f9-b22752be660a" containerID="35d412548ce2232ec5c975027c9be64cea9e54fa413ebf26c9ba7a6b2ad5358f" exitCode=0 Feb 17 15:10:43 crc kubenswrapper[4717]: I0217 15:10:43.242007 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h4rqv" event={"ID":"32d66078-a470-4a86-92f9-b22752be660a","Type":"ContainerDied","Data":"35d412548ce2232ec5c975027c9be64cea9e54fa413ebf26c9ba7a6b2ad5358f"} Feb 17 15:10:43 crc kubenswrapper[4717]: I0217 15:10:43.243747 4717 generic.go:334] "Generic (PLEG): container finished" podID="4fb161a2-7b80-400b-a4eb-ca14e4386721" containerID="4c52b97c2f1f802fa203b77aa3e96251c4bfb84f3f0c1f41b12f110bdbce3bbb" exitCode=0 Feb 17 15:10:43 crc kubenswrapper[4717]: I0217 15:10:43.243778 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dfdd-account-create-update-gm7bt" event={"ID":"4fb161a2-7b80-400b-a4eb-ca14e4386721","Type":"ContainerDied","Data":"4c52b97c2f1f802fa203b77aa3e96251c4bfb84f3f0c1f41b12f110bdbce3bbb"} Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.582267 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e750-account-create-update-kv9rx" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.652317 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-operator-scripts\") pod \"9527cb55-cce6-4fb4-aa5c-b2495a6264f6\" (UID: \"9527cb55-cce6-4fb4-aa5c-b2495a6264f6\") " Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.652520 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9mxw\" (UniqueName: \"kubernetes.io/projected/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-kube-api-access-l9mxw\") pod \"9527cb55-cce6-4fb4-aa5c-b2495a6264f6\" (UID: \"9527cb55-cce6-4fb4-aa5c-b2495a6264f6\") " Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.653708 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "9527cb55-cce6-4fb4-aa5c-b2495a6264f6" (UID: "9527cb55-cce6-4fb4-aa5c-b2495a6264f6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.671482 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-kube-api-access-l9mxw" (OuterVolumeSpecName: "kube-api-access-l9mxw") pod "9527cb55-cce6-4fb4-aa5c-b2495a6264f6" (UID: "9527cb55-cce6-4fb4-aa5c-b2495a6264f6"). InnerVolumeSpecName "kube-api-access-l9mxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.754361 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9mxw\" (UniqueName: \"kubernetes.io/projected/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-kube-api-access-l9mxw\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.754412 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9527cb55-cce6-4fb4-aa5c-b2495a6264f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.837005 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dfdd-account-create-update-gm7bt" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.844498 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-468e-account-create-update-pf2dw" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.851223 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h4rqv" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.956494 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjf2h\" (UniqueName: \"kubernetes.io/projected/c0e668f2-e8a2-403b-af75-9ea042dad451-kube-api-access-hjf2h\") pod \"c0e668f2-e8a2-403b-af75-9ea042dad451\" (UID: \"c0e668f2-e8a2-403b-af75-9ea042dad451\") " Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.956558 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb161a2-7b80-400b-a4eb-ca14e4386721-operator-scripts\") pod \"4fb161a2-7b80-400b-a4eb-ca14e4386721\" (UID: \"4fb161a2-7b80-400b-a4eb-ca14e4386721\") " Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.956589 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8lcv\" (UniqueName: \"kubernetes.io/projected/4fb161a2-7b80-400b-a4eb-ca14e4386721-kube-api-access-k8lcv\") pod \"4fb161a2-7b80-400b-a4eb-ca14e4386721\" (UID: \"4fb161a2-7b80-400b-a4eb-ca14e4386721\") " Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.956638 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32d66078-a470-4a86-92f9-b22752be660a-operator-scripts\") pod \"32d66078-a470-4a86-92f9-b22752be660a\" (UID: \"32d66078-a470-4a86-92f9-b22752be660a\") " Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.956745 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnrs2\" (UniqueName: \"kubernetes.io/projected/32d66078-a470-4a86-92f9-b22752be660a-kube-api-access-cnrs2\") pod \"32d66078-a470-4a86-92f9-b22752be660a\" (UID: \"32d66078-a470-4a86-92f9-b22752be660a\") " Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.956769 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0e668f2-e8a2-403b-af75-9ea042dad451-operator-scripts\") pod \"c0e668f2-e8a2-403b-af75-9ea042dad451\" (UID: \"c0e668f2-e8a2-403b-af75-9ea042dad451\") " Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.957569 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32d66078-a470-4a86-92f9-b22752be660a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32d66078-a470-4a86-92f9-b22752be660a" (UID: "32d66078-a470-4a86-92f9-b22752be660a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.957973 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0e668f2-e8a2-403b-af75-9ea042dad451-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0e668f2-e8a2-403b-af75-9ea042dad451" (UID: "c0e668f2-e8a2-403b-af75-9ea042dad451"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.958277 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb161a2-7b80-400b-a4eb-ca14e4386721-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4fb161a2-7b80-400b-a4eb-ca14e4386721" (UID: "4fb161a2-7b80-400b-a4eb-ca14e4386721"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.961550 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb161a2-7b80-400b-a4eb-ca14e4386721-kube-api-access-k8lcv" (OuterVolumeSpecName: "kube-api-access-k8lcv") pod "4fb161a2-7b80-400b-a4eb-ca14e4386721" (UID: "4fb161a2-7b80-400b-a4eb-ca14e4386721"). 
InnerVolumeSpecName "kube-api-access-k8lcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.961806 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d66078-a470-4a86-92f9-b22752be660a-kube-api-access-cnrs2" (OuterVolumeSpecName: "kube-api-access-cnrs2") pod "32d66078-a470-4a86-92f9-b22752be660a" (UID: "32d66078-a470-4a86-92f9-b22752be660a"). InnerVolumeSpecName "kube-api-access-cnrs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:44 crc kubenswrapper[4717]: I0217 15:10:44.961841 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e668f2-e8a2-403b-af75-9ea042dad451-kube-api-access-hjf2h" (OuterVolumeSpecName: "kube-api-access-hjf2h") pod "c0e668f2-e8a2-403b-af75-9ea042dad451" (UID: "c0e668f2-e8a2-403b-af75-9ea042dad451"). InnerVolumeSpecName "kube-api-access-hjf2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.059440 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjf2h\" (UniqueName: \"kubernetes.io/projected/c0e668f2-e8a2-403b-af75-9ea042dad451-kube-api-access-hjf2h\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.059476 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb161a2-7b80-400b-a4eb-ca14e4386721-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.059486 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8lcv\" (UniqueName: \"kubernetes.io/projected/4fb161a2-7b80-400b-a4eb-ca14e4386721-kube-api-access-k8lcv\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.059497 4717 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32d66078-a470-4a86-92f9-b22752be660a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.059506 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnrs2\" (UniqueName: \"kubernetes.io/projected/32d66078-a470-4a86-92f9-b22752be660a-kube-api-access-cnrs2\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.059516 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0e668f2-e8a2-403b-af75-9ea042dad451-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.261964 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-468e-account-create-update-pf2dw" event={"ID":"c0e668f2-e8a2-403b-af75-9ea042dad451","Type":"ContainerDied","Data":"ae4d3cc997cd2a2fb40c7829d926da0f0a28c5e6aec8919e7ab1937a949d636b"} Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.262010 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae4d3cc997cd2a2fb40c7829d926da0f0a28c5e6aec8919e7ab1937a949d636b" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.262023 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-468e-account-create-update-pf2dw" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.277775 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h4rqv" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.277774 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h4rqv" event={"ID":"32d66078-a470-4a86-92f9-b22752be660a","Type":"ContainerDied","Data":"0553b1138111beb9e229291907715a2966044eb335a15f5023435b533bccf0e2"} Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.277855 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0553b1138111beb9e229291907715a2966044eb335a15f5023435b533bccf0e2" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.280293 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e750-account-create-update-kv9rx" event={"ID":"9527cb55-cce6-4fb4-aa5c-b2495a6264f6","Type":"ContainerDied","Data":"678cee376887c8fdc352fecf72b05fa372b52f7f5421f7c94c16b989b13941af"} Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.280376 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="678cee376887c8fdc352fecf72b05fa372b52f7f5421f7c94c16b989b13941af" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.280619 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e750-account-create-update-kv9rx" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.281710 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dfdd-account-create-update-gm7bt" event={"ID":"4fb161a2-7b80-400b-a4eb-ca14e4386721","Type":"ContainerDied","Data":"d5eeed32fe017f446e2cf2fcb26ad6dc6007749f9db92d73dcff309130541f8b"} Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.281762 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5eeed32fe017f446e2cf2fcb26ad6dc6007749f9db92d73dcff309130541f8b" Feb 17 15:10:45 crc kubenswrapper[4717]: I0217 15:10:45.281837 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dfdd-account-create-update-gm7bt" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.233441 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hvpfc"] Feb 17 15:10:49 crc kubenswrapper[4717]: E0217 15:10:49.234643 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9527cb55-cce6-4fb4-aa5c-b2495a6264f6" containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.234675 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9527cb55-cce6-4fb4-aa5c-b2495a6264f6" containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: E0217 15:10:49.234700 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3786e2-123b-4300-86fb-505483139531" containerName="mariadb-database-create" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.234735 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3786e2-123b-4300-86fb-505483139531" containerName="mariadb-database-create" Feb 17 15:10:49 crc kubenswrapper[4717]: E0217 15:10:49.234759 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb161a2-7b80-400b-a4eb-ca14e4386721" 
containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.234777 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb161a2-7b80-400b-a4eb-ca14e4386721" containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: E0217 15:10:49.234814 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d66078-a470-4a86-92f9-b22752be660a" containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.234833 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d66078-a470-4a86-92f9-b22752be660a" containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: E0217 15:10:49.234879 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e668f2-e8a2-403b-af75-9ea042dad451" containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.234893 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e668f2-e8a2-403b-af75-9ea042dad451" containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: E0217 15:10:49.234918 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a4229e-485b-43ed-80aa-a9f3212a132c" containerName="ovn-config" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.234934 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a4229e-485b-43ed-80aa-a9f3212a132c" containerName="ovn-config" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.235306 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e668f2-e8a2-403b-af75-9ea042dad451" containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.235354 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9527cb55-cce6-4fb4-aa5c-b2495a6264f6" containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.235373 4717 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3786e2-123b-4300-86fb-505483139531" containerName="mariadb-database-create" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.235397 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d66078-a470-4a86-92f9-b22752be660a" containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.235420 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb161a2-7b80-400b-a4eb-ca14e4386721" containerName="mariadb-account-create-update" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.235490 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a4229e-485b-43ed-80aa-a9f3212a132c" containerName="ovn-config" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.236799 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.246056 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.263707 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dgrmv" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.271020 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hvpfc"] Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.372029 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-combined-ca-bundle\") pod \"glance-db-sync-hvpfc\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.372232 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4vfh2\" (UniqueName: \"kubernetes.io/projected/0b96c066-9919-4133-93df-69c9abdc0c6c-kube-api-access-4vfh2\") pod \"glance-db-sync-hvpfc\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.372304 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-config-data\") pod \"glance-db-sync-hvpfc\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.372521 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-db-sync-config-data\") pod \"glance-db-sync-hvpfc\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.474386 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vfh2\" (UniqueName: \"kubernetes.io/projected/0b96c066-9919-4133-93df-69c9abdc0c6c-kube-api-access-4vfh2\") pod \"glance-db-sync-hvpfc\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.474469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-config-data\") pod \"glance-db-sync-hvpfc\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.474595 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-db-sync-config-data\") pod \"glance-db-sync-hvpfc\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.474669 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-combined-ca-bundle\") pod \"glance-db-sync-hvpfc\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.479938 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-db-sync-config-data\") pod \"glance-db-sync-hvpfc\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.482973 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-config-data\") pod \"glance-db-sync-hvpfc\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.487597 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-combined-ca-bundle\") pod \"glance-db-sync-hvpfc\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.491779 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vfh2\" (UniqueName: \"kubernetes.io/projected/0b96c066-9919-4133-93df-69c9abdc0c6c-kube-api-access-4vfh2\") pod \"glance-db-sync-hvpfc\" (UID: 
\"0b96c066-9919-4133-93df-69c9abdc0c6c\") " pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.573485 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hvpfc" Feb 17 15:10:49 crc kubenswrapper[4717]: I0217 15:10:49.694626 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:10:49 crc kubenswrapper[4717]: E0217 15:10:49.694988 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:10:49 crc kubenswrapper[4717]: E0217 15:10:49.695011 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 15:10:49 crc kubenswrapper[4717]: E0217 15:10:49.695056 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift podName:518c6b94-614f-42fd-9016-122cdcfcb8c9 nodeName:}" failed. No retries permitted until 2026-02-17 15:11:21.695042653 +0000 UTC m=+1148.110883129 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift") pod "swift-storage-0" (UID: "518c6b94-614f-42fd-9016-122cdcfcb8c9") : configmap "swift-ring-files" not found Feb 17 15:10:50 crc kubenswrapper[4717]: I0217 15:10:50.115839 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hvpfc"] Feb 17 15:10:50 crc kubenswrapper[4717]: I0217 15:10:50.325879 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hvpfc" event={"ID":"0b96c066-9919-4133-93df-69c9abdc0c6c","Type":"ContainerStarted","Data":"c5100cd04039cf9f5c4e9b024412ef27d7a094f8871576ee15700b4fd88ef6b0"} Feb 17 15:10:50 crc kubenswrapper[4717]: I0217 15:10:50.900025 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.269275 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.279473 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-v77mr"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.280850 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-v77mr" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.303753 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v77mr"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.353139 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rhzj6" event={"ID":"929feb4c-fd82-4293-9a44-a6f53816cdae","Type":"ContainerStarted","Data":"297d7dee7de3559a08fc309ab61ae39f0bee2739bcfc142f2114a1aa80eb58df"} Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.381725 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-rhzj6" podStartSLOduration=1.9942992400000001 podStartE2EDuration="33.3817051s" podCreationTimestamp="2026-02-17 15:10:18 +0000 UTC" firstStartedPulling="2026-02-17 15:10:19.060520878 +0000 UTC m=+1085.476361354" lastFinishedPulling="2026-02-17 15:10:50.447926738 +0000 UTC m=+1116.863767214" observedRunningTime="2026-02-17 15:10:51.37679913 +0000 UTC m=+1117.792639626" watchObservedRunningTime="2026-02-17 15:10:51.3817051 +0000 UTC m=+1117.797545566" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.413313 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-716d-account-create-update-npdx8"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.414967 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-716d-account-create-update-npdx8" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.417164 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.421300 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-716d-account-create-update-npdx8"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.434836 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkfsx\" (UniqueName: \"kubernetes.io/projected/ceff3034-f756-4cc9-9b21-3c38aed2b429-kube-api-access-zkfsx\") pod \"cinder-db-create-v77mr\" (UID: \"ceff3034-f756-4cc9-9b21-3c38aed2b429\") " pod="openstack/cinder-db-create-v77mr" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.435925 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ceff3034-f756-4cc9-9b21-3c38aed2b429-operator-scripts\") pod \"cinder-db-create-v77mr\" (UID: \"ceff3034-f756-4cc9-9b21-3c38aed2b429\") " pod="openstack/cinder-db-create-v77mr" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.479071 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6b77v"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.480347 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6b77v" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.491888 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6b77v"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.537866 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkfsx\" (UniqueName: \"kubernetes.io/projected/ceff3034-f756-4cc9-9b21-3c38aed2b429-kube-api-access-zkfsx\") pod \"cinder-db-create-v77mr\" (UID: \"ceff3034-f756-4cc9-9b21-3c38aed2b429\") " pod="openstack/cinder-db-create-v77mr" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.537964 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4c8g\" (UniqueName: \"kubernetes.io/projected/bf67c089-b17d-42bb-9cdf-3b5252b212c1-kube-api-access-j4c8g\") pod \"cinder-716d-account-create-update-npdx8\" (UID: \"bf67c089-b17d-42bb-9cdf-3b5252b212c1\") " pod="openstack/cinder-716d-account-create-update-npdx8" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.537988 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf67c089-b17d-42bb-9cdf-3b5252b212c1-operator-scripts\") pod \"cinder-716d-account-create-update-npdx8\" (UID: \"bf67c089-b17d-42bb-9cdf-3b5252b212c1\") " pod="openstack/cinder-716d-account-create-update-npdx8" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.538042 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ceff3034-f756-4cc9-9b21-3c38aed2b429-operator-scripts\") pod \"cinder-db-create-v77mr\" (UID: \"ceff3034-f756-4cc9-9b21-3c38aed2b429\") " pod="openstack/cinder-db-create-v77mr" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.539058 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ceff3034-f756-4cc9-9b21-3c38aed2b429-operator-scripts\") pod \"cinder-db-create-v77mr\" (UID: \"ceff3034-f756-4cc9-9b21-3c38aed2b429\") " pod="openstack/cinder-db-create-v77mr" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.584704 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkfsx\" (UniqueName: \"kubernetes.io/projected/ceff3034-f756-4cc9-9b21-3c38aed2b429-kube-api-access-zkfsx\") pod \"cinder-db-create-v77mr\" (UID: \"ceff3034-f756-4cc9-9b21-3c38aed2b429\") " pod="openstack/cinder-db-create-v77mr" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.603983 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v77mr" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.621553 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-63ba-account-create-update-phs52"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.624602 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-63ba-account-create-update-phs52" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.628630 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.633397 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-63ba-account-create-update-phs52"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.640238 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8tpc\" (UniqueName: \"kubernetes.io/projected/e74a323a-cd4e-435f-beaa-9d3b6689e98c-kube-api-access-r8tpc\") pod \"barbican-db-create-6b77v\" (UID: \"e74a323a-cd4e-435f-beaa-9d3b6689e98c\") " pod="openstack/barbican-db-create-6b77v" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.640407 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4c8g\" (UniqueName: \"kubernetes.io/projected/bf67c089-b17d-42bb-9cdf-3b5252b212c1-kube-api-access-j4c8g\") pod \"cinder-716d-account-create-update-npdx8\" (UID: \"bf67c089-b17d-42bb-9cdf-3b5252b212c1\") " pod="openstack/cinder-716d-account-create-update-npdx8" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.640432 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf67c089-b17d-42bb-9cdf-3b5252b212c1-operator-scripts\") pod \"cinder-716d-account-create-update-npdx8\" (UID: \"bf67c089-b17d-42bb-9cdf-3b5252b212c1\") " pod="openstack/cinder-716d-account-create-update-npdx8" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.640464 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e74a323a-cd4e-435f-beaa-9d3b6689e98c-operator-scripts\") pod \"barbican-db-create-6b77v\" (UID: 
\"e74a323a-cd4e-435f-beaa-9d3b6689e98c\") " pod="openstack/barbican-db-create-6b77v" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.641497 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf67c089-b17d-42bb-9cdf-3b5252b212c1-operator-scripts\") pod \"cinder-716d-account-create-update-npdx8\" (UID: \"bf67c089-b17d-42bb-9cdf-3b5252b212c1\") " pod="openstack/cinder-716d-account-create-update-npdx8" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.677628 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4c8g\" (UniqueName: \"kubernetes.io/projected/bf67c089-b17d-42bb-9cdf-3b5252b212c1-kube-api-access-j4c8g\") pod \"cinder-716d-account-create-update-npdx8\" (UID: \"bf67c089-b17d-42bb-9cdf-3b5252b212c1\") " pod="openstack/cinder-716d-account-create-update-npdx8" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.687411 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-vsfnm"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.689779 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vsfnm" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.695346 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vsfnm"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.732691 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-716d-account-create-update-npdx8" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.741711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e0a6ff8-9436-409a-b86f-df23c821c302-operator-scripts\") pod \"barbican-63ba-account-create-update-phs52\" (UID: \"7e0a6ff8-9436-409a-b86f-df23c821c302\") " pod="openstack/barbican-63ba-account-create-update-phs52" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.741807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e74a323a-cd4e-435f-beaa-9d3b6689e98c-operator-scripts\") pod \"barbican-db-create-6b77v\" (UID: \"e74a323a-cd4e-435f-beaa-9d3b6689e98c\") " pod="openstack/barbican-db-create-6b77v" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.741854 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8tpc\" (UniqueName: \"kubernetes.io/projected/e74a323a-cd4e-435f-beaa-9d3b6689e98c-kube-api-access-r8tpc\") pod \"barbican-db-create-6b77v\" (UID: \"e74a323a-cd4e-435f-beaa-9d3b6689e98c\") " pod="openstack/barbican-db-create-6b77v" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.741920 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r46pg\" (UniqueName: \"kubernetes.io/projected/7e0a6ff8-9436-409a-b86f-df23c821c302-kube-api-access-r46pg\") pod \"barbican-63ba-account-create-update-phs52\" (UID: \"7e0a6ff8-9436-409a-b86f-df23c821c302\") " pod="openstack/barbican-63ba-account-create-update-phs52" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.742827 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e74a323a-cd4e-435f-beaa-9d3b6689e98c-operator-scripts\") 
pod \"barbican-db-create-6b77v\" (UID: \"e74a323a-cd4e-435f-beaa-9d3b6689e98c\") " pod="openstack/barbican-db-create-6b77v" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.763005 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kvdwd"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.764029 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kvdwd" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.775402 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.775576 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sdkz9" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.775749 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.775879 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.785138 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kvdwd"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.802402 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8tpc\" (UniqueName: \"kubernetes.io/projected/e74a323a-cd4e-435f-beaa-9d3b6689e98c-kube-api-access-r8tpc\") pod \"barbican-db-create-6b77v\" (UID: \"e74a323a-cd4e-435f-beaa-9d3b6689e98c\") " pod="openstack/barbican-db-create-6b77v" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.803143 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-27f0-account-create-update-59n4f"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.804347 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-27f0-account-create-update-59n4f" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.810577 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.828729 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-27f0-account-create-update-59n4f"] Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.844899 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khzfj\" (UniqueName: \"kubernetes.io/projected/bd826039-5737-4f09-b722-e6263c314341-kube-api-access-khzfj\") pod \"neutron-db-create-vsfnm\" (UID: \"bd826039-5737-4f09-b722-e6263c314341\") " pod="openstack/neutron-db-create-vsfnm" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.844976 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e0a6ff8-9436-409a-b86f-df23c821c302-operator-scripts\") pod \"barbican-63ba-account-create-update-phs52\" (UID: \"7e0a6ff8-9436-409a-b86f-df23c821c302\") " pod="openstack/barbican-63ba-account-create-update-phs52" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.845023 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd826039-5737-4f09-b722-e6263c314341-operator-scripts\") pod \"neutron-db-create-vsfnm\" (UID: \"bd826039-5737-4f09-b722-e6263c314341\") " pod="openstack/neutron-db-create-vsfnm" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.845066 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkn8s\" (UniqueName: \"kubernetes.io/projected/35f16717-cc42-4465-8a1e-b7377b11b987-kube-api-access-xkn8s\") pod \"keystone-db-sync-kvdwd\" (UID: 
\"35f16717-cc42-4465-8a1e-b7377b11b987\") " pod="openstack/keystone-db-sync-kvdwd" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.845117 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-config-data\") pod \"keystone-db-sync-kvdwd\" (UID: \"35f16717-cc42-4465-8a1e-b7377b11b987\") " pod="openstack/keystone-db-sync-kvdwd" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.845145 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-combined-ca-bundle\") pod \"keystone-db-sync-kvdwd\" (UID: \"35f16717-cc42-4465-8a1e-b7377b11b987\") " pod="openstack/keystone-db-sync-kvdwd" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.845165 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r46pg\" (UniqueName: \"kubernetes.io/projected/7e0a6ff8-9436-409a-b86f-df23c821c302-kube-api-access-r46pg\") pod \"barbican-63ba-account-create-update-phs52\" (UID: \"7e0a6ff8-9436-409a-b86f-df23c821c302\") " pod="openstack/barbican-63ba-account-create-update-phs52" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.846450 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e0a6ff8-9436-409a-b86f-df23c821c302-operator-scripts\") pod \"barbican-63ba-account-create-update-phs52\" (UID: \"7e0a6ff8-9436-409a-b86f-df23c821c302\") " pod="openstack/barbican-63ba-account-create-update-phs52" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.878094 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r46pg\" (UniqueName: \"kubernetes.io/projected/7e0a6ff8-9436-409a-b86f-df23c821c302-kube-api-access-r46pg\") pod 
\"barbican-63ba-account-create-update-phs52\" (UID: \"7e0a6ff8-9436-409a-b86f-df23c821c302\") " pod="openstack/barbican-63ba-account-create-update-phs52" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.946725 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkn8s\" (UniqueName: \"kubernetes.io/projected/35f16717-cc42-4465-8a1e-b7377b11b987-kube-api-access-xkn8s\") pod \"keystone-db-sync-kvdwd\" (UID: \"35f16717-cc42-4465-8a1e-b7377b11b987\") " pod="openstack/keystone-db-sync-kvdwd" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.946776 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-config-data\") pod \"keystone-db-sync-kvdwd\" (UID: \"35f16717-cc42-4465-8a1e-b7377b11b987\") " pod="openstack/keystone-db-sync-kvdwd" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.946820 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-combined-ca-bundle\") pod \"keystone-db-sync-kvdwd\" (UID: \"35f16717-cc42-4465-8a1e-b7377b11b987\") " pod="openstack/keystone-db-sync-kvdwd" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.946874 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms2fc\" (UniqueName: \"kubernetes.io/projected/44668081-3deb-40f0-a60e-302ee0a8b85a-kube-api-access-ms2fc\") pod \"neutron-27f0-account-create-update-59n4f\" (UID: \"44668081-3deb-40f0-a60e-302ee0a8b85a\") " pod="openstack/neutron-27f0-account-create-update-59n4f" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.946919 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khzfj\" (UniqueName: 
\"kubernetes.io/projected/bd826039-5737-4f09-b722-e6263c314341-kube-api-access-khzfj\") pod \"neutron-db-create-vsfnm\" (UID: \"bd826039-5737-4f09-b722-e6263c314341\") " pod="openstack/neutron-db-create-vsfnm" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.946984 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44668081-3deb-40f0-a60e-302ee0a8b85a-operator-scripts\") pod \"neutron-27f0-account-create-update-59n4f\" (UID: \"44668081-3deb-40f0-a60e-302ee0a8b85a\") " pod="openstack/neutron-27f0-account-create-update-59n4f" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.947007 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd826039-5737-4f09-b722-e6263c314341-operator-scripts\") pod \"neutron-db-create-vsfnm\" (UID: \"bd826039-5737-4f09-b722-e6263c314341\") " pod="openstack/neutron-db-create-vsfnm" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.947707 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd826039-5737-4f09-b722-e6263c314341-operator-scripts\") pod \"neutron-db-create-vsfnm\" (UID: \"bd826039-5737-4f09-b722-e6263c314341\") " pod="openstack/neutron-db-create-vsfnm" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.954926 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-combined-ca-bundle\") pod \"keystone-db-sync-kvdwd\" (UID: \"35f16717-cc42-4465-8a1e-b7377b11b987\") " pod="openstack/keystone-db-sync-kvdwd" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.964010 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-config-data\") pod \"keystone-db-sync-kvdwd\" (UID: \"35f16717-cc42-4465-8a1e-b7377b11b987\") " pod="openstack/keystone-db-sync-kvdwd" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.964608 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkn8s\" (UniqueName: \"kubernetes.io/projected/35f16717-cc42-4465-8a1e-b7377b11b987-kube-api-access-xkn8s\") pod \"keystone-db-sync-kvdwd\" (UID: \"35f16717-cc42-4465-8a1e-b7377b11b987\") " pod="openstack/keystone-db-sync-kvdwd" Feb 17 15:10:51 crc kubenswrapper[4717]: I0217 15:10:51.966022 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khzfj\" (UniqueName: \"kubernetes.io/projected/bd826039-5737-4f09-b722-e6263c314341-kube-api-access-khzfj\") pod \"neutron-db-create-vsfnm\" (UID: \"bd826039-5737-4f09-b722-e6263c314341\") " pod="openstack/neutron-db-create-vsfnm" Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.048954 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms2fc\" (UniqueName: \"kubernetes.io/projected/44668081-3deb-40f0-a60e-302ee0a8b85a-kube-api-access-ms2fc\") pod \"neutron-27f0-account-create-update-59n4f\" (UID: \"44668081-3deb-40f0-a60e-302ee0a8b85a\") " pod="openstack/neutron-27f0-account-create-update-59n4f" Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.049055 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44668081-3deb-40f0-a60e-302ee0a8b85a-operator-scripts\") pod \"neutron-27f0-account-create-update-59n4f\" (UID: \"44668081-3deb-40f0-a60e-302ee0a8b85a\") " pod="openstack/neutron-27f0-account-create-update-59n4f" Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.049766 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/44668081-3deb-40f0-a60e-302ee0a8b85a-operator-scripts\") pod \"neutron-27f0-account-create-update-59n4f\" (UID: \"44668081-3deb-40f0-a60e-302ee0a8b85a\") " pod="openstack/neutron-27f0-account-create-update-59n4f" Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.067336 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms2fc\" (UniqueName: \"kubernetes.io/projected/44668081-3deb-40f0-a60e-302ee0a8b85a-kube-api-access-ms2fc\") pod \"neutron-27f0-account-create-update-59n4f\" (UID: \"44668081-3deb-40f0-a60e-302ee0a8b85a\") " pod="openstack/neutron-27f0-account-create-update-59n4f" Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.070532 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-63ba-account-create-update-phs52" Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.097626 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6b77v" Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.116510 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vsfnm" Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.134691 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kvdwd" Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.134957 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-716d-account-create-update-npdx8"] Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.151006 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-27f0-account-create-update-59n4f" Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.218062 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v77mr"] Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.367470 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-716d-account-create-update-npdx8" event={"ID":"bf67c089-b17d-42bb-9cdf-3b5252b212c1","Type":"ContainerStarted","Data":"c90ef171ea28a7a3597da2fb50285bb813b081df6469b471e33316b4f09a48a2"} Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.368910 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v77mr" event={"ID":"ceff3034-f756-4cc9-9b21-3c38aed2b429","Type":"ContainerStarted","Data":"e7305afce9c90ecf517b1572a9fe11c57cd5e9d71ac5e8e299bf27b9ba47af79"} Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.730450 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-63ba-account-create-update-phs52"] Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.774630 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vsfnm"] Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.786806 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6b77v"] Feb 17 15:10:52 crc kubenswrapper[4717]: W0217 15:10:52.799507 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd826039_5737_4f09_b722_e6263c314341.slice/crio-61689a80ecb889fb0ff873c0029f47ab63f7158d6ae59b423668145778f18aad WatchSource:0}: Error finding container 61689a80ecb889fb0ff873c0029f47ab63f7158d6ae59b423668145778f18aad: Status 404 returned error can't find the container with id 61689a80ecb889fb0ff873c0029f47ab63f7158d6ae59b423668145778f18aad Feb 17 15:10:52 crc kubenswrapper[4717]: W0217 15:10:52.802228 4717 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode74a323a_cd4e_435f_beaa_9d3b6689e98c.slice/crio-d47123bb74a01f95ba3611aa903d22439c64303604d7a31aeb194f1424cbe773 WatchSource:0}: Error finding container d47123bb74a01f95ba3611aa903d22439c64303604d7a31aeb194f1424cbe773: Status 404 returned error can't find the container with id d47123bb74a01f95ba3611aa903d22439c64303604d7a31aeb194f1424cbe773 Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.915309 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-27f0-account-create-update-59n4f"] Feb 17 15:10:52 crc kubenswrapper[4717]: W0217 15:10:52.931358 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44668081_3deb_40f0_a60e_302ee0a8b85a.slice/crio-2823ee62405a6a4d9c8e2caef06d24c8f1e975873b6c4f51ef0022357d04a43d WatchSource:0}: Error finding container 2823ee62405a6a4d9c8e2caef06d24c8f1e975873b6c4f51ef0022357d04a43d: Status 404 returned error can't find the container with id 2823ee62405a6a4d9c8e2caef06d24c8f1e975873b6c4f51ef0022357d04a43d Feb 17 15:10:52 crc kubenswrapper[4717]: I0217 15:10:52.936510 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kvdwd"] Feb 17 15:10:53 crc kubenswrapper[4717]: E0217 15:10:53.229184 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podceff3034_f756_4cc9_9b21_3c38aed2b429.slice/crio-conmon-13f05a509876ea4dab3270bff59fb10757072e420fe157431ed2b3f7024dc964.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf67c089_b17d_42bb_9cdf_3b5252b212c1.slice/crio-3d6ba34bc3431518d9c7baeb3ed11e5dc4ddcaed3cc8df7d6b8381813a94d076.scope\": RecentStats: unable to find data in memory cache]" Feb 
17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.384707 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vsfnm" event={"ID":"bd826039-5737-4f09-b722-e6263c314341","Type":"ContainerStarted","Data":"8530531676be825a0cc4aec8a3b4f9dcc05d36a222dad3ae51e91e2760f22dab"} Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.384766 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vsfnm" event={"ID":"bd826039-5737-4f09-b722-e6263c314341","Type":"ContainerStarted","Data":"61689a80ecb889fb0ff873c0029f47ab63f7158d6ae59b423668145778f18aad"} Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.388185 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6b77v" event={"ID":"e74a323a-cd4e-435f-beaa-9d3b6689e98c","Type":"ContainerStarted","Data":"c15c6278b2ff88b6bdc31bb1c6e474cadfe194c14c9d6f908a15298bdf9ea965"} Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.388231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6b77v" event={"ID":"e74a323a-cd4e-435f-beaa-9d3b6689e98c","Type":"ContainerStarted","Data":"d47123bb74a01f95ba3611aa903d22439c64303604d7a31aeb194f1424cbe773"} Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.393137 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-63ba-account-create-update-phs52" event={"ID":"7e0a6ff8-9436-409a-b86f-df23c821c302","Type":"ContainerStarted","Data":"0fbede69481c75260fa48a7c223c63adac345952e5dbf8816b3cacc2d8a97fab"} Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.393180 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-63ba-account-create-update-phs52" event={"ID":"7e0a6ff8-9436-409a-b86f-df23c821c302","Type":"ContainerStarted","Data":"4548d8d74d034964e315a25c233ee136b236f7fb64018d6aaf1d9d7f5d7704b7"} Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.395715 4717 generic.go:334] "Generic (PLEG): 
container finished" podID="ceff3034-f756-4cc9-9b21-3c38aed2b429" containerID="13f05a509876ea4dab3270bff59fb10757072e420fe157431ed2b3f7024dc964" exitCode=0 Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.395759 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v77mr" event={"ID":"ceff3034-f756-4cc9-9b21-3c38aed2b429","Type":"ContainerDied","Data":"13f05a509876ea4dab3270bff59fb10757072e420fe157431ed2b3f7024dc964"} Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.400924 4717 generic.go:334] "Generic (PLEG): container finished" podID="bf67c089-b17d-42bb-9cdf-3b5252b212c1" containerID="3d6ba34bc3431518d9c7baeb3ed11e5dc4ddcaed3cc8df7d6b8381813a94d076" exitCode=0 Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.400968 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-716d-account-create-update-npdx8" event={"ID":"bf67c089-b17d-42bb-9cdf-3b5252b212c1","Type":"ContainerDied","Data":"3d6ba34bc3431518d9c7baeb3ed11e5dc4ddcaed3cc8df7d6b8381813a94d076"} Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.408764 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-vsfnm" podStartSLOduration=2.40874378 podStartE2EDuration="2.40874378s" podCreationTimestamp="2026-02-17 15:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:53.407095123 +0000 UTC m=+1119.822935609" watchObservedRunningTime="2026-02-17 15:10:53.40874378 +0000 UTC m=+1119.824584266" Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.410208 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kvdwd" event={"ID":"35f16717-cc42-4465-8a1e-b7377b11b987","Type":"ContainerStarted","Data":"c2cf94b44f3bd9e0f48c889a4f50ef0e03dc8290eb0eef17fea793f1b8fe89cd"} Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.412101 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-27f0-account-create-update-59n4f" event={"ID":"44668081-3deb-40f0-a60e-302ee0a8b85a","Type":"ContainerStarted","Data":"94672654284a0165636c6d9fe9de12d3bde736cb946139baf5afc940ef05d8b9"} Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.412148 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-27f0-account-create-update-59n4f" event={"ID":"44668081-3deb-40f0-a60e-302ee0a8b85a","Type":"ContainerStarted","Data":"2823ee62405a6a4d9c8e2caef06d24c8f1e975873b6c4f51ef0022357d04a43d"} Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.448576 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-6b77v" podStartSLOduration=2.448557435 podStartE2EDuration="2.448557435s" podCreationTimestamp="2026-02-17 15:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:53.443522542 +0000 UTC m=+1119.859363018" watchObservedRunningTime="2026-02-17 15:10:53.448557435 +0000 UTC m=+1119.864397901" Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.468186 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-63ba-account-create-update-phs52" podStartSLOduration=2.468165924 podStartE2EDuration="2.468165924s" podCreationTimestamp="2026-02-17 15:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:53.463491801 +0000 UTC m=+1119.879332277" watchObservedRunningTime="2026-02-17 15:10:53.468165924 +0000 UTC m=+1119.884006400" Feb 17 15:10:53 crc kubenswrapper[4717]: I0217 15:10:53.500937 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-27f0-account-create-update-59n4f" podStartSLOduration=2.500907648 podStartE2EDuration="2.500907648s" 
podCreationTimestamp="2026-02-17 15:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:10:53.500265899 +0000 UTC m=+1119.916106385" watchObservedRunningTime="2026-02-17 15:10:53.500907648 +0000 UTC m=+1119.916748124" Feb 17 15:10:54 crc kubenswrapper[4717]: I0217 15:10:54.425849 4717 generic.go:334] "Generic (PLEG): container finished" podID="e74a323a-cd4e-435f-beaa-9d3b6689e98c" containerID="c15c6278b2ff88b6bdc31bb1c6e474cadfe194c14c9d6f908a15298bdf9ea965" exitCode=0 Feb 17 15:10:54 crc kubenswrapper[4717]: I0217 15:10:54.425929 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6b77v" event={"ID":"e74a323a-cd4e-435f-beaa-9d3b6689e98c","Type":"ContainerDied","Data":"c15c6278b2ff88b6bdc31bb1c6e474cadfe194c14c9d6f908a15298bdf9ea965"} Feb 17 15:10:54 crc kubenswrapper[4717]: I0217 15:10:54.428771 4717 generic.go:334] "Generic (PLEG): container finished" podID="7e0a6ff8-9436-409a-b86f-df23c821c302" containerID="0fbede69481c75260fa48a7c223c63adac345952e5dbf8816b3cacc2d8a97fab" exitCode=0 Feb 17 15:10:54 crc kubenswrapper[4717]: I0217 15:10:54.428851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-63ba-account-create-update-phs52" event={"ID":"7e0a6ff8-9436-409a-b86f-df23c821c302","Type":"ContainerDied","Data":"0fbede69481c75260fa48a7c223c63adac345952e5dbf8816b3cacc2d8a97fab"} Feb 17 15:10:54 crc kubenswrapper[4717]: I0217 15:10:54.430880 4717 generic.go:334] "Generic (PLEG): container finished" podID="44668081-3deb-40f0-a60e-302ee0a8b85a" containerID="94672654284a0165636c6d9fe9de12d3bde736cb946139baf5afc940ef05d8b9" exitCode=0 Feb 17 15:10:54 crc kubenswrapper[4717]: I0217 15:10:54.430948 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-27f0-account-create-update-59n4f" 
event={"ID":"44668081-3deb-40f0-a60e-302ee0a8b85a","Type":"ContainerDied","Data":"94672654284a0165636c6d9fe9de12d3bde736cb946139baf5afc940ef05d8b9"} Feb 17 15:10:54 crc kubenswrapper[4717]: I0217 15:10:54.432574 4717 generic.go:334] "Generic (PLEG): container finished" podID="bd826039-5737-4f09-b722-e6263c314341" containerID="8530531676be825a0cc4aec8a3b4f9dcc05d36a222dad3ae51e91e2760f22dab" exitCode=0 Feb 17 15:10:54 crc kubenswrapper[4717]: I0217 15:10:54.432608 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vsfnm" event={"ID":"bd826039-5737-4f09-b722-e6263c314341","Type":"ContainerDied","Data":"8530531676be825a0cc4aec8a3b4f9dcc05d36a222dad3ae51e91e2760f22dab"} Feb 17 15:10:54 crc kubenswrapper[4717]: I0217 15:10:54.884987 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v77mr" Feb 17 15:10:54 crc kubenswrapper[4717]: I0217 15:10:54.980028 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-716d-account-create-update-npdx8" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.011711 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ceff3034-f756-4cc9-9b21-3c38aed2b429-operator-scripts\") pod \"ceff3034-f756-4cc9-9b21-3c38aed2b429\" (UID: \"ceff3034-f756-4cc9-9b21-3c38aed2b429\") " Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.011880 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkfsx\" (UniqueName: \"kubernetes.io/projected/ceff3034-f756-4cc9-9b21-3c38aed2b429-kube-api-access-zkfsx\") pod \"ceff3034-f756-4cc9-9b21-3c38aed2b429\" (UID: \"ceff3034-f756-4cc9-9b21-3c38aed2b429\") " Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.012095 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceff3034-f756-4cc9-9b21-3c38aed2b429-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ceff3034-f756-4cc9-9b21-3c38aed2b429" (UID: "ceff3034-f756-4cc9-9b21-3c38aed2b429"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.012527 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ceff3034-f756-4cc9-9b21-3c38aed2b429-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.017372 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceff3034-f756-4cc9-9b21-3c38aed2b429-kube-api-access-zkfsx" (OuterVolumeSpecName: "kube-api-access-zkfsx") pod "ceff3034-f756-4cc9-9b21-3c38aed2b429" (UID: "ceff3034-f756-4cc9-9b21-3c38aed2b429"). InnerVolumeSpecName "kube-api-access-zkfsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.114877 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4c8g\" (UniqueName: \"kubernetes.io/projected/bf67c089-b17d-42bb-9cdf-3b5252b212c1-kube-api-access-j4c8g\") pod \"bf67c089-b17d-42bb-9cdf-3b5252b212c1\" (UID: \"bf67c089-b17d-42bb-9cdf-3b5252b212c1\") " Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.114937 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf67c089-b17d-42bb-9cdf-3b5252b212c1-operator-scripts\") pod \"bf67c089-b17d-42bb-9cdf-3b5252b212c1\" (UID: \"bf67c089-b17d-42bb-9cdf-3b5252b212c1\") " Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.115390 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkfsx\" (UniqueName: \"kubernetes.io/projected/ceff3034-f756-4cc9-9b21-3c38aed2b429-kube-api-access-zkfsx\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.115475 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf67c089-b17d-42bb-9cdf-3b5252b212c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf67c089-b17d-42bb-9cdf-3b5252b212c1" (UID: "bf67c089-b17d-42bb-9cdf-3b5252b212c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.117791 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf67c089-b17d-42bb-9cdf-3b5252b212c1-kube-api-access-j4c8g" (OuterVolumeSpecName: "kube-api-access-j4c8g") pod "bf67c089-b17d-42bb-9cdf-3b5252b212c1" (UID: "bf67c089-b17d-42bb-9cdf-3b5252b212c1"). InnerVolumeSpecName "kube-api-access-j4c8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.216788 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4c8g\" (UniqueName: \"kubernetes.io/projected/bf67c089-b17d-42bb-9cdf-3b5252b212c1-kube-api-access-j4c8g\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.217111 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf67c089-b17d-42bb-9cdf-3b5252b212c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.451110 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v77mr" event={"ID":"ceff3034-f756-4cc9-9b21-3c38aed2b429","Type":"ContainerDied","Data":"e7305afce9c90ecf517b1572a9fe11c57cd5e9d71ac5e8e299bf27b9ba47af79"} Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.451160 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7305afce9c90ecf517b1572a9fe11c57cd5e9d71ac5e8e299bf27b9ba47af79" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.451257 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v77mr" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.454204 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-716d-account-create-update-npdx8" event={"ID":"bf67c089-b17d-42bb-9cdf-3b5252b212c1","Type":"ContainerDied","Data":"c90ef171ea28a7a3597da2fb50285bb813b081df6469b471e33316b4f09a48a2"} Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.454447 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c90ef171ea28a7a3597da2fb50285bb813b081df6469b471e33316b4f09a48a2" Feb 17 15:10:55 crc kubenswrapper[4717]: I0217 15:10:55.454651 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-716d-account-create-update-npdx8" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.330567 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-63ba-account-create-update-phs52" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.338565 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-27f0-account-create-update-59n4f" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.478693 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms2fc\" (UniqueName: \"kubernetes.io/projected/44668081-3deb-40f0-a60e-302ee0a8b85a-kube-api-access-ms2fc\") pod \"44668081-3deb-40f0-a60e-302ee0a8b85a\" (UID: \"44668081-3deb-40f0-a60e-302ee0a8b85a\") " Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.478873 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e0a6ff8-9436-409a-b86f-df23c821c302-operator-scripts\") pod \"7e0a6ff8-9436-409a-b86f-df23c821c302\" (UID: \"7e0a6ff8-9436-409a-b86f-df23c821c302\") " Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.478936 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44668081-3deb-40f0-a60e-302ee0a8b85a-operator-scripts\") pod \"44668081-3deb-40f0-a60e-302ee0a8b85a\" (UID: \"44668081-3deb-40f0-a60e-302ee0a8b85a\") " Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.479001 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r46pg\" (UniqueName: \"kubernetes.io/projected/7e0a6ff8-9436-409a-b86f-df23c821c302-kube-api-access-r46pg\") pod \"7e0a6ff8-9436-409a-b86f-df23c821c302\" (UID: \"7e0a6ff8-9436-409a-b86f-df23c821c302\") " Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 
15:10:58.479726 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e0a6ff8-9436-409a-b86f-df23c821c302-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e0a6ff8-9436-409a-b86f-df23c821c302" (UID: "7e0a6ff8-9436-409a-b86f-df23c821c302"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.480113 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44668081-3deb-40f0-a60e-302ee0a8b85a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44668081-3deb-40f0-a60e-302ee0a8b85a" (UID: "44668081-3deb-40f0-a60e-302ee0a8b85a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.485529 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0a6ff8-9436-409a-b86f-df23c821c302-kube-api-access-r46pg" (OuterVolumeSpecName: "kube-api-access-r46pg") pod "7e0a6ff8-9436-409a-b86f-df23c821c302" (UID: "7e0a6ff8-9436-409a-b86f-df23c821c302"). InnerVolumeSpecName "kube-api-access-r46pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.486246 4717 generic.go:334] "Generic (PLEG): container finished" podID="929feb4c-fd82-4293-9a44-a6f53816cdae" containerID="297d7dee7de3559a08fc309ab61ae39f0bee2739bcfc142f2114a1aa80eb58df" exitCode=0 Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.486297 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rhzj6" event={"ID":"929feb4c-fd82-4293-9a44-a6f53816cdae","Type":"ContainerDied","Data":"297d7dee7de3559a08fc309ab61ae39f0bee2739bcfc142f2114a1aa80eb58df"} Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.491694 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-27f0-account-create-update-59n4f" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.492113 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-27f0-account-create-update-59n4f" event={"ID":"44668081-3deb-40f0-a60e-302ee0a8b85a","Type":"ContainerDied","Data":"2823ee62405a6a4d9c8e2caef06d24c8f1e975873b6c4f51ef0022357d04a43d"} Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.492143 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2823ee62405a6a4d9c8e2caef06d24c8f1e975873b6c4f51ef0022357d04a43d" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.493784 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-63ba-account-create-update-phs52" event={"ID":"7e0a6ff8-9436-409a-b86f-df23c821c302","Type":"ContainerDied","Data":"4548d8d74d034964e315a25c233ee136b236f7fb64018d6aaf1d9d7f5d7704b7"} Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.493812 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4548d8d74d034964e315a25c233ee136b236f7fb64018d6aaf1d9d7f5d7704b7" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.493849 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-63ba-account-create-update-phs52" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.493879 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44668081-3deb-40f0-a60e-302ee0a8b85a-kube-api-access-ms2fc" (OuterVolumeSpecName: "kube-api-access-ms2fc") pod "44668081-3deb-40f0-a60e-302ee0a8b85a" (UID: "44668081-3deb-40f0-a60e-302ee0a8b85a"). InnerVolumeSpecName "kube-api-access-ms2fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.580910 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e0a6ff8-9436-409a-b86f-df23c821c302-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.580939 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44668081-3deb-40f0-a60e-302ee0a8b85a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.580950 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r46pg\" (UniqueName: \"kubernetes.io/projected/7e0a6ff8-9436-409a-b86f-df23c821c302-kube-api-access-r46pg\") on node \"crc\" DevicePath \"\"" Feb 17 15:10:58 crc kubenswrapper[4717]: I0217 15:10:58.580960 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms2fc\" (UniqueName: \"kubernetes.io/projected/44668081-3deb-40f0-a60e-302ee0a8b85a-kube-api-access-ms2fc\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.215106 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vsfnm" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.221130 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6b77v" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.226882 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8tpc\" (UniqueName: \"kubernetes.io/projected/e74a323a-cd4e-435f-beaa-9d3b6689e98c-kube-api-access-r8tpc\") pod \"e74a323a-cd4e-435f-beaa-9d3b6689e98c\" (UID: \"e74a323a-cd4e-435f-beaa-9d3b6689e98c\") " Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.227023 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd826039-5737-4f09-b722-e6263c314341-operator-scripts\") pod \"bd826039-5737-4f09-b722-e6263c314341\" (UID: \"bd826039-5737-4f09-b722-e6263c314341\") " Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.227331 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khzfj\" (UniqueName: \"kubernetes.io/projected/bd826039-5737-4f09-b722-e6263c314341-kube-api-access-khzfj\") pod \"bd826039-5737-4f09-b722-e6263c314341\" (UID: \"bd826039-5737-4f09-b722-e6263c314341\") " Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.227434 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e74a323a-cd4e-435f-beaa-9d3b6689e98c-operator-scripts\") pod \"e74a323a-cd4e-435f-beaa-9d3b6689e98c\" (UID: \"e74a323a-cd4e-435f-beaa-9d3b6689e98c\") " Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.227852 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd826039-5737-4f09-b722-e6263c314341-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd826039-5737-4f09-b722-e6263c314341" (UID: "bd826039-5737-4f09-b722-e6263c314341"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.227946 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e74a323a-cd4e-435f-beaa-9d3b6689e98c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e74a323a-cd4e-435f-beaa-9d3b6689e98c" (UID: "e74a323a-cd4e-435f-beaa-9d3b6689e98c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.228254 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e74a323a-cd4e-435f-beaa-9d3b6689e98c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.228284 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd826039-5737-4f09-b722-e6263c314341-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.230778 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rhzj6" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.238662 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e74a323a-cd4e-435f-beaa-9d3b6689e98c-kube-api-access-r8tpc" (OuterVolumeSpecName: "kube-api-access-r8tpc") pod "e74a323a-cd4e-435f-beaa-9d3b6689e98c" (UID: "e74a323a-cd4e-435f-beaa-9d3b6689e98c"). InnerVolumeSpecName "kube-api-access-r8tpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.247784 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd826039-5737-4f09-b722-e6263c314341-kube-api-access-khzfj" (OuterVolumeSpecName: "kube-api-access-khzfj") pod "bd826039-5737-4f09-b722-e6263c314341" (UID: "bd826039-5737-4f09-b722-e6263c314341"). InnerVolumeSpecName "kube-api-access-khzfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.329435 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-combined-ca-bundle\") pod \"929feb4c-fd82-4293-9a44-a6f53816cdae\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.329520 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-swiftconf\") pod \"929feb4c-fd82-4293-9a44-a6f53816cdae\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.329570 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-ring-data-devices\") pod \"929feb4c-fd82-4293-9a44-a6f53816cdae\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.329618 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svjk8\" (UniqueName: \"kubernetes.io/projected/929feb4c-fd82-4293-9a44-a6f53816cdae-kube-api-access-svjk8\") pod \"929feb4c-fd82-4293-9a44-a6f53816cdae\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 
15:11:06.329654 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-scripts\") pod \"929feb4c-fd82-4293-9a44-a6f53816cdae\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.329676 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/929feb4c-fd82-4293-9a44-a6f53816cdae-etc-swift\") pod \"929feb4c-fd82-4293-9a44-a6f53816cdae\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") " Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.329919 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khzfj\" (UniqueName: \"kubernetes.io/projected/bd826039-5737-4f09-b722-e6263c314341-kube-api-access-khzfj\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.329932 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8tpc\" (UniqueName: \"kubernetes.io/projected/e74a323a-cd4e-435f-beaa-9d3b6689e98c-kube-api-access-r8tpc\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.330380 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "929feb4c-fd82-4293-9a44-a6f53816cdae" (UID: "929feb4c-fd82-4293-9a44-a6f53816cdae"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.330727 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/929feb4c-fd82-4293-9a44-a6f53816cdae-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "929feb4c-fd82-4293-9a44-a6f53816cdae" (UID: "929feb4c-fd82-4293-9a44-a6f53816cdae"). 
InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.333444 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/929feb4c-fd82-4293-9a44-a6f53816cdae-kube-api-access-svjk8" (OuterVolumeSpecName: "kube-api-access-svjk8") pod "929feb4c-fd82-4293-9a44-a6f53816cdae" (UID: "929feb4c-fd82-4293-9a44-a6f53816cdae"). InnerVolumeSpecName "kube-api-access-svjk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.349600 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-scripts" (OuterVolumeSpecName: "scripts") pod "929feb4c-fd82-4293-9a44-a6f53816cdae" (UID: "929feb4c-fd82-4293-9a44-a6f53816cdae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.351898 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "929feb4c-fd82-4293-9a44-a6f53816cdae" (UID: "929feb4c-fd82-4293-9a44-a6f53816cdae"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.359472 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "929feb4c-fd82-4293-9a44-a6f53816cdae" (UID: "929feb4c-fd82-4293-9a44-a6f53816cdae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.430990 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-dispersionconf\") pod \"929feb4c-fd82-4293-9a44-a6f53816cdae\" (UID: \"929feb4c-fd82-4293-9a44-a6f53816cdae\") "
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.431285 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.431298 4717 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/929feb4c-fd82-4293-9a44-a6f53816cdae-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.431308 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.431318 4717 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.431327 4717 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/929feb4c-fd82-4293-9a44-a6f53816cdae-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.431337 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svjk8\" (UniqueName: \"kubernetes.io/projected/929feb4c-fd82-4293-9a44-a6f53816cdae-kube-api-access-svjk8\") on node \"crc\" DevicePath \"\""
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.436236 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "929feb4c-fd82-4293-9a44-a6f53816cdae" (UID: "929feb4c-fd82-4293-9a44-a6f53816cdae"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.532298 4717 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/929feb4c-fd82-4293-9a44-a6f53816cdae-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.569203 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6b77v"
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.569150 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6b77v" event={"ID":"e74a323a-cd4e-435f-beaa-9d3b6689e98c","Type":"ContainerDied","Data":"d47123bb74a01f95ba3611aa903d22439c64303604d7a31aeb194f1424cbe773"}
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.569477 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d47123bb74a01f95ba3611aa903d22439c64303604d7a31aeb194f1424cbe773"
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.573006 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rhzj6" event={"ID":"929feb4c-fd82-4293-9a44-a6f53816cdae","Type":"ContainerDied","Data":"e885b1e0db5d414dce376939bf5865f09c34fa5aa362130297227f704db3ac75"}
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.573037 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e885b1e0db5d414dce376939bf5865f09c34fa5aa362130297227f704db3ac75"
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.573298 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rhzj6"
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.575049 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vsfnm" event={"ID":"bd826039-5737-4f09-b722-e6263c314341","Type":"ContainerDied","Data":"61689a80ecb889fb0ff873c0029f47ab63f7158d6ae59b423668145778f18aad"}
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.575102 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61689a80ecb889fb0ff873c0029f47ab63f7158d6ae59b423668145778f18aad"
Feb 17 15:11:06 crc kubenswrapper[4717]: I0217 15:11:06.575175 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vsfnm"
Feb 17 15:11:07 crc kubenswrapper[4717]: I0217 15:11:07.586205 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hvpfc" event={"ID":"0b96c066-9919-4133-93df-69c9abdc0c6c","Type":"ContainerStarted","Data":"5e39d789f33c6404226590e54b1fffc1b8489430ebf7ad53c4a39530fdcd4a6d"}
Feb 17 15:11:07 crc kubenswrapper[4717]: I0217 15:11:07.587623 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kvdwd" event={"ID":"35f16717-cc42-4465-8a1e-b7377b11b987","Type":"ContainerStarted","Data":"94b7e9b20e2fe255f7f53b2a331597b32c3774ac2a004642df653176a28a70b0"}
Feb 17 15:11:07 crc kubenswrapper[4717]: I0217 15:11:07.605812 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hvpfc" podStartSLOduration=1.9910477119999999 podStartE2EDuration="18.605793713s" podCreationTimestamp="2026-02-17 15:10:49 +0000 UTC" firstStartedPulling="2026-02-17 15:10:50.119275828 +0000 UTC m=+1116.535116304" lastFinishedPulling="2026-02-17 15:11:06.734021829 +0000 UTC m=+1133.149862305" observedRunningTime="2026-02-17 15:11:07.599598777 +0000 UTC m=+1134.015439253" watchObservedRunningTime="2026-02-17 15:11:07.605793713 +0000 UTC m=+1134.021634189"
Feb 17 15:11:07 crc kubenswrapper[4717]: I0217 15:11:07.618101 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kvdwd" podStartSLOduration=3.331425376 podStartE2EDuration="16.618065733s" podCreationTimestamp="2026-02-17 15:10:51 +0000 UTC" firstStartedPulling="2026-02-17 15:10:52.985543375 +0000 UTC m=+1119.401383851" lastFinishedPulling="2026-02-17 15:11:06.272183692 +0000 UTC m=+1132.688024208" observedRunningTime="2026-02-17 15:11:07.617779535 +0000 UTC m=+1134.033620051" watchObservedRunningTime="2026-02-17 15:11:07.618065733 +0000 UTC m=+1134.033906209"
Feb 17 15:11:13 crc kubenswrapper[4717]: I0217 15:11:13.642749 4717 generic.go:334] "Generic (PLEG): container finished" podID="35f16717-cc42-4465-8a1e-b7377b11b987" containerID="94b7e9b20e2fe255f7f53b2a331597b32c3774ac2a004642df653176a28a70b0" exitCode=0
Feb 17 15:11:13 crc kubenswrapper[4717]: I0217 15:11:13.642836 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kvdwd" event={"ID":"35f16717-cc42-4465-8a1e-b7377b11b987","Type":"ContainerDied","Data":"94b7e9b20e2fe255f7f53b2a331597b32c3774ac2a004642df653176a28a70b0"}
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.024591 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kvdwd"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.114620 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkn8s\" (UniqueName: \"kubernetes.io/projected/35f16717-cc42-4465-8a1e-b7377b11b987-kube-api-access-xkn8s\") pod \"35f16717-cc42-4465-8a1e-b7377b11b987\" (UID: \"35f16717-cc42-4465-8a1e-b7377b11b987\") "
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.114686 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-config-data\") pod \"35f16717-cc42-4465-8a1e-b7377b11b987\" (UID: \"35f16717-cc42-4465-8a1e-b7377b11b987\") "
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.114963 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-combined-ca-bundle\") pod \"35f16717-cc42-4465-8a1e-b7377b11b987\" (UID: \"35f16717-cc42-4465-8a1e-b7377b11b987\") "
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.122812 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f16717-cc42-4465-8a1e-b7377b11b987-kube-api-access-xkn8s" (OuterVolumeSpecName: "kube-api-access-xkn8s") pod "35f16717-cc42-4465-8a1e-b7377b11b987" (UID: "35f16717-cc42-4465-8a1e-b7377b11b987"). InnerVolumeSpecName "kube-api-access-xkn8s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.142189 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35f16717-cc42-4465-8a1e-b7377b11b987" (UID: "35f16717-cc42-4465-8a1e-b7377b11b987"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.169525 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-config-data" (OuterVolumeSpecName: "config-data") pod "35f16717-cc42-4465-8a1e-b7377b11b987" (UID: "35f16717-cc42-4465-8a1e-b7377b11b987"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.216914 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkn8s\" (UniqueName: \"kubernetes.io/projected/35f16717-cc42-4465-8a1e-b7377b11b987-kube-api-access-xkn8s\") on node \"crc\" DevicePath \"\""
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.216947 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.216960 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f16717-cc42-4465-8a1e-b7377b11b987-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.664358 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kvdwd" event={"ID":"35f16717-cc42-4465-8a1e-b7377b11b987","Type":"ContainerDied","Data":"c2cf94b44f3bd9e0f48c889a4f50ef0e03dc8290eb0eef17fea793f1b8fe89cd"}
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.664404 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2cf94b44f3bd9e0f48c889a4f50ef0e03dc8290eb0eef17fea793f1b8fe89cd"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.664446 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kvdwd"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.895016 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-cx6dj"]
Feb 17 15:11:15 crc kubenswrapper[4717]: E0217 15:11:15.895758 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="929feb4c-fd82-4293-9a44-a6f53816cdae" containerName="swift-ring-rebalance"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.895775 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="929feb4c-fd82-4293-9a44-a6f53816cdae" containerName="swift-ring-rebalance"
Feb 17 15:11:15 crc kubenswrapper[4717]: E0217 15:11:15.895786 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf67c089-b17d-42bb-9cdf-3b5252b212c1" containerName="mariadb-account-create-update"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.895793 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf67c089-b17d-42bb-9cdf-3b5252b212c1" containerName="mariadb-account-create-update"
Feb 17 15:11:15 crc kubenswrapper[4717]: E0217 15:11:15.895807 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44668081-3deb-40f0-a60e-302ee0a8b85a" containerName="mariadb-account-create-update"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.895813 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="44668081-3deb-40f0-a60e-302ee0a8b85a" containerName="mariadb-account-create-update"
Feb 17 15:11:15 crc kubenswrapper[4717]: E0217 15:11:15.895828 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceff3034-f756-4cc9-9b21-3c38aed2b429" containerName="mariadb-database-create"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.895834 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceff3034-f756-4cc9-9b21-3c38aed2b429" containerName="mariadb-database-create"
Feb 17 15:11:15 crc kubenswrapper[4717]: E0217 15:11:15.895848 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74a323a-cd4e-435f-beaa-9d3b6689e98c" containerName="mariadb-database-create"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.895856 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74a323a-cd4e-435f-beaa-9d3b6689e98c" containerName="mariadb-database-create"
Feb 17 15:11:15 crc kubenswrapper[4717]: E0217 15:11:15.895887 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0a6ff8-9436-409a-b86f-df23c821c302" containerName="mariadb-account-create-update"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.895893 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0a6ff8-9436-409a-b86f-df23c821c302" containerName="mariadb-account-create-update"
Feb 17 15:11:15 crc kubenswrapper[4717]: E0217 15:11:15.895904 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f16717-cc42-4465-8a1e-b7377b11b987" containerName="keystone-db-sync"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.895911 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f16717-cc42-4465-8a1e-b7377b11b987" containerName="keystone-db-sync"
Feb 17 15:11:15 crc kubenswrapper[4717]: E0217 15:11:15.895921 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd826039-5737-4f09-b722-e6263c314341" containerName="mariadb-database-create"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.895927 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd826039-5737-4f09-b722-e6263c314341" containerName="mariadb-database-create"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.896118 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="929feb4c-fd82-4293-9a44-a6f53816cdae" containerName="swift-ring-rebalance"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.896131 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e74a323a-cd4e-435f-beaa-9d3b6689e98c" containerName="mariadb-database-create"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.896140 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf67c089-b17d-42bb-9cdf-3b5252b212c1" containerName="mariadb-account-create-update"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.896148 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="44668081-3deb-40f0-a60e-302ee0a8b85a" containerName="mariadb-account-create-update"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.896156 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0a6ff8-9436-409a-b86f-df23c821c302" containerName="mariadb-account-create-update"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.896163 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f16717-cc42-4465-8a1e-b7377b11b987" containerName="keystone-db-sync"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.896172 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd826039-5737-4f09-b722-e6263c314341" containerName="mariadb-database-create"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.896182 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceff3034-f756-4cc9-9b21-3c38aed2b429" containerName="mariadb-database-create"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.896997 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.915415 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-92w2x"]
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.916688 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.919437 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.919886 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.920131 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.920264 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.920421 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sdkz9"
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.943648 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-92w2x"]
Feb 17 15:11:15 crc kubenswrapper[4717]: I0217 15:11:15.956455 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-cx6dj"]
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.033835 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-scripts\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.033917 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-fernet-keys\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.033999 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6fcb\" (UniqueName: \"kubernetes.io/projected/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-kube-api-access-j6fcb\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.034033 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-credential-keys\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.034058 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.034093 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlpz9\" (UniqueName: \"kubernetes.io/projected/4ef17b80-df4a-454a-9680-65dce7903b36-kube-api-access-rlpz9\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.034950 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-dns-svc\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.035967 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-combined-ca-bundle\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.036055 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-config\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.036120 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-config-data\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.036275 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.055176 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-894985b7-dwvdw"]
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.057673 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-894985b7-dwvdw"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.069271 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.069496 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-ndcpm"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.070258 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.070405 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.078861 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-894985b7-dwvdw"]
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.085869 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8zrk2"]
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.091749 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8zrk2"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.094208 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-25rgb"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.094414 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.094625 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.100237 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8zrk2"]
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140113 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6fcb\" (UniqueName: \"kubernetes.io/projected/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-kube-api-access-j6fcb\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140169 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-credential-keys\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140189 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140220 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlpz9\" (UniqueName: \"kubernetes.io/projected/4ef17b80-df4a-454a-9680-65dce7903b36-kube-api-access-rlpz9\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140239 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-dns-svc\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140267 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad0e548-d0dc-452c-b749-713a58712d11-logs\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140295 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-scripts\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140314 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-combined-ca-bundle\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140335 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-config\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140358 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-config-data\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140373 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-config-data\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140405 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ad0e548-d0dc-452c-b749-713a58712d11-horizon-secret-key\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140439 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140463 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-scripts\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140490 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg9qt\" (UniqueName: \"kubernetes.io/projected/0ad0e548-d0dc-452c-b749-713a58712d11-kube-api-access-cg9qt\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.140512 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-fernet-keys\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.152959 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.153109 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-dns-svc\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.154559 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-config\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.155217 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.158781 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-config-data\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.159808 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.163803 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-credential-keys\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.166811 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-fernet-keys\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.173192 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-combined-ca-bundle\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.174097 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-scripts\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.174425 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.178482 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.178660 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.182584 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlpz9\" (UniqueName: \"kubernetes.io/projected/4ef17b80-df4a-454a-9680-65dce7903b36-kube-api-access-rlpz9\") pod \"keystone-bootstrap-92w2x\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " pod="openstack/keystone-bootstrap-92w2x"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.173568 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-6ncb2"]
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.189593 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6ncb2"]
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.189742 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6ncb2"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.211062 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6fcb\" (UniqueName: \"kubernetes.io/projected/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-kube-api-access-j6fcb\") pod \"dnsmasq-dns-f877ddd87-cx6dj\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.215188 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.216941 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vpgkk"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.225752 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-cx6dj"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.244606 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg9qt\" (UniqueName: \"kubernetes.io/projected/0ad0e548-d0dc-452c-b749-713a58712d11-kube-api-access-cg9qt\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254212 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-config-data\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254278 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-config\") pod \"neutron-db-sync-8zrk2\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " pod="openstack/neutron-db-sync-8zrk2"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254311 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254369 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-scripts\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254405 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p2r8\" (UniqueName: \"kubernetes.io/projected/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-kube-api-access-2p2r8\") pod \"neutron-db-sync-8zrk2\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " pod="openstack/neutron-db-sync-8zrk2"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254438 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6q6v\" (UniqueName: \"kubernetes.io/projected/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-kube-api-access-v6q6v\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254547 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254567 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-combined-ca-bundle\") pod \"neutron-db-sync-8zrk2\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " pod="openstack/neutron-db-sync-8zrk2"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254603 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-run-httpd\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254630 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad0e548-d0dc-452c-b749-713a58712d11-logs\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254667 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-scripts\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254716 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-log-httpd\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0"
Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254745 4717
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-config-data\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.254810 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ad0e548-d0dc-452c-b749-713a58712d11-horizon-secret-key\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.250454 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.251680 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-92w2x" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.261182 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad0e548-d0dc-452c-b749-713a58712d11-logs\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.262337 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-scripts\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.265344 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-config-data\") pod \"horizon-894985b7-dwvdw\" (UID: 
\"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.270788 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ad0e548-d0dc-452c-b749-713a58712d11-horizon-secret-key\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.313102 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg9qt\" (UniqueName: \"kubernetes.io/projected/0ad0e548-d0dc-452c-b749-713a58712d11-kube-api-access-cg9qt\") pod \"horizon-894985b7-dwvdw\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " pod="openstack/horizon-894985b7-dwvdw" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.331589 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-s5lzg"] Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.332680 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.355888 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.355931 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-combined-ca-bundle\") pod \"neutron-db-sync-8zrk2\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " pod="openstack/neutron-db-sync-8zrk2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.355954 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-run-httpd\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.355995 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmc6f\" (UniqueName: \"kubernetes.io/projected/de8aa40e-4ead-46b8-a94b-0a3602d030ef-kube-api-access-qmc6f\") pod \"barbican-db-sync-6ncb2\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " pod="openstack/barbican-db-sync-6ncb2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.356026 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-log-httpd\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.356060 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-db-sync-config-data\") pod \"barbican-db-sync-6ncb2\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " pod="openstack/barbican-db-sync-6ncb2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.356125 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-combined-ca-bundle\") pod \"barbican-db-sync-6ncb2\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " pod="openstack/barbican-db-sync-6ncb2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.356160 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-config-data\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.356181 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-config\") pod \"neutron-db-sync-8zrk2\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " pod="openstack/neutron-db-sync-8zrk2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.356199 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.356222 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-scripts\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.356760 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p2r8\" (UniqueName: \"kubernetes.io/projected/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-kube-api-access-2p2r8\") pod \"neutron-db-sync-8zrk2\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " pod="openstack/neutron-db-sync-8zrk2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.356790 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6q6v\" (UniqueName: \"kubernetes.io/projected/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-kube-api-access-v6q6v\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.359737 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bzltb" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.360040 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.360168 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.360951 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-log-httpd\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.361180 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-run-httpd\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.368918 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.370847 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.371425 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-scripts\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.374887 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-config\") pod \"neutron-db-sync-8zrk2\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " pod="openstack/neutron-db-sync-8zrk2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.382839 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-combined-ca-bundle\") pod \"neutron-db-sync-8zrk2\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " pod="openstack/neutron-db-sync-8zrk2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.383628 
4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-894985b7-dwvdw" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.384299 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-config-data\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.444017 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6q6v\" (UniqueName: \"kubernetes.io/projected/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-kube-api-access-v6q6v\") pod \"ceilometer-0\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.457898 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmc6f\" (UniqueName: \"kubernetes.io/projected/de8aa40e-4ead-46b8-a94b-0a3602d030ef-kube-api-access-qmc6f\") pod \"barbican-db-sync-6ncb2\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " pod="openstack/barbican-db-sync-6ncb2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.457951 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-db-sync-config-data\") pod \"barbican-db-sync-6ncb2\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " pod="openstack/barbican-db-sync-6ncb2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.457995 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-config-data\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc 
kubenswrapper[4717]: I0217 15:11:16.458016 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-db-sync-config-data\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.458039 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-combined-ca-bundle\") pod \"barbican-db-sync-6ncb2\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " pod="openstack/barbican-db-sync-6ncb2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.458061 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2klgj\" (UniqueName: \"kubernetes.io/projected/b9c9596c-6124-44d0-b06b-a99477938b79-kube-api-access-2klgj\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.458161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c9596c-6124-44d0-b06b-a99477938b79-etc-machine-id\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.458185 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-scripts\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.458215 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-combined-ca-bundle\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.462933 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s5lzg"] Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.477163 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fg4vl"] Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.477519 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-db-sync-config-data\") pod \"barbican-db-sync-6ncb2\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " pod="openstack/barbican-db-sync-6ncb2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.478214 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p2r8\" (UniqueName: \"kubernetes.io/projected/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-kube-api-access-2p2r8\") pod \"neutron-db-sync-8zrk2\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " pod="openstack/neutron-db-sync-8zrk2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.478656 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-combined-ca-bundle\") pod \"barbican-db-sync-6ncb2\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " pod="openstack/barbican-db-sync-6ncb2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.478728 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.481562 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-v2976" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.495117 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.495296 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.509216 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmc6f\" (UniqueName: \"kubernetes.io/projected/de8aa40e-4ead-46b8-a94b-0a3602d030ef-kube-api-access-qmc6f\") pod \"barbican-db-sync-6ncb2\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " pod="openstack/barbican-db-sync-6ncb2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.527283 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6496446dcc-ktb8t"] Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.528694 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.560885 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-cx6dj"] Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.560920 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fg4vl"] Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.562322 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-scripts\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.562438 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-combined-ca-bundle\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.562537 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-config-data\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.562640 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c9596c-6124-44d0-b06b-a99477938b79-etc-machine-id\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.562733 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7055a012-2f5d-4ba2-b56d-a9ec73e11944-logs\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.562811 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-scripts\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.562900 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-combined-ca-bundle\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.563005 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thgx7\" (UniqueName: \"kubernetes.io/projected/7055a012-2f5d-4ba2-b56d-a9ec73e11944-kube-api-access-thgx7\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.563160 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-config-data\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.568712 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-db-sync-config-data\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.568874 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2klgj\" (UniqueName: \"kubernetes.io/projected/b9c9596c-6124-44d0-b06b-a99477938b79-kube-api-access-2klgj\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.572340 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c9596c-6124-44d0-b06b-a99477938b79-etc-machine-id\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.574523 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6496446dcc-ktb8t"] Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.586068 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-config-data\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.603899 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-scripts\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.603982 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-nm8xp"] Feb 17 
15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.606478 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-combined-ca-bundle\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.608390 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.613034 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2klgj\" (UniqueName: \"kubernetes.io/projected/b9c9596c-6124-44d0-b06b-a99477938b79-kube-api-access-2klgj\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.624172 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-nm8xp"] Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.650511 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-db-sync-config-data\") pod \"cinder-db-sync-s5lzg\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.670200 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thgx7\" (UniqueName: \"kubernetes.io/projected/7055a012-2f5d-4ba2-b56d-a9ec73e11944-kube-api-access-thgx7\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.670278 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ckcs2\" (UniqueName: \"kubernetes.io/projected/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-kube-api-access-ckcs2\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.670313 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-scripts\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.670338 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-combined-ca-bundle\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.670365 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-config-data\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.670388 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-config-data\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.670431 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7055a012-2f5d-4ba2-b56d-a9ec73e11944-logs\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.670464 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-scripts\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.670489 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-logs\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.670508 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-horizon-secret-key\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.671158 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7055a012-2f5d-4ba2-b56d-a9ec73e11944-logs\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.674783 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-combined-ca-bundle\") pod 
\"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.676089 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-scripts\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.682524 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-config-data\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.703747 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thgx7\" (UniqueName: \"kubernetes.io/projected/7055a012-2f5d-4ba2-b56d-a9ec73e11944-kube-api-access-thgx7\") pod \"placement-db-sync-fg4vl\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.717044 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.724505 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8zrk2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.772117 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-cx6dj"] Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.773951 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckcs2\" (UniqueName: \"kubernetes.io/projected/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-kube-api-access-ckcs2\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.774020 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-config\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.774886 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-config-data\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.774968 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.775055 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-scripts\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.776040 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-logs\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.776074 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-horizon-secret-key\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.776129 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggl4k\" (UniqueName: \"kubernetes.io/projected/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-kube-api-access-ggl4k\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.776259 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.776285 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.778376 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-logs\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.778658 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-config-data\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.778741 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-scripts\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.780014 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6ncb2" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.791522 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-horizon-secret-key\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.794575 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckcs2\" (UniqueName: \"kubernetes.io/projected/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-kube-api-access-ckcs2\") pod \"horizon-6496446dcc-ktb8t\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.884316 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.884432 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggl4k\" (UniqueName: \"kubernetes.io/projected/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-kube-api-access-ggl4k\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.884521 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " 
pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.884544 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.884618 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-config\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.885373 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-config\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.885898 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.885942 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.888882 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.925196 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.929352 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggl4k\" (UniqueName: \"kubernetes.io/projected/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-kube-api-access-ggl4k\") pod \"dnsmasq-dns-68dcc9cf6f-nm8xp\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.963578 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fg4vl" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.981801 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:16 crc kubenswrapper[4717]: I0217 15:11:16.987532 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.088242 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-894985b7-dwvdw"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.210853 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-92w2x"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.225980 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8zrk2"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.329313 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.486133 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6ncb2"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.634532 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fg4vl"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.652138 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s5lzg"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.725343 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6ncb2" event={"ID":"de8aa40e-4ead-46b8-a94b-0a3602d030ef","Type":"ContainerStarted","Data":"ac49e956de028d3062244d4a3ecd741f99e3bc7bda0918e1f2939ad253194131"} Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.727370 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-nm8xp"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.730326 4717 generic.go:334] "Generic (PLEG): container finished" podID="4cdb9cb3-dc99-41a1-9197-36ca43ac52a0" containerID="b4e618c924f2a8281988eab68750102a52f26ce5f3877209794274f825ac2966" exitCode=0 Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.730563 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-f877ddd87-cx6dj" event={"ID":"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0","Type":"ContainerDied","Data":"b4e618c924f2a8281988eab68750102a52f26ce5f3877209794274f825ac2966"} Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.730593 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-cx6dj" event={"ID":"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0","Type":"ContainerStarted","Data":"8200ad51765ad13c9bd93319065ebe7c79ceff8eebbbc8d435ee17f86aefa987"} Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.738362 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1","Type":"ContainerStarted","Data":"711b964646b2722c68c605a6aa40b482da7cad7bb4fb503552f432a7c9a2d6a2"} Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.748349 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-894985b7-dwvdw" event={"ID":"0ad0e548-d0dc-452c-b749-713a58712d11","Type":"ContainerStarted","Data":"69a4b485011444c5052601b81117165df4203e6d631eeee4943a706686cadcd1"} Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.752797 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6496446dcc-ktb8t"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.781910 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b6b86dfcc-ptxwk"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.783415 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.788626 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b6b86dfcc-ptxwk"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.795295 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.809069 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fg4vl" event={"ID":"7055a012-2f5d-4ba2-b56d-a9ec73e11944","Type":"ContainerStarted","Data":"24e453bb9c17cb29fc16601523d5ab38c4e4deb98e91ebc82bfdd93d9a64b7d4"} Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.827178 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6496446dcc-ktb8t"] Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.828060 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8zrk2" event={"ID":"246d9375-0f70-4c31-ac82-63ecc1bdcd2b","Type":"ContainerStarted","Data":"ec74d17aa82c8980ca1c5457c5c15d213e2abfa96f837f151625e09426ab161a"} Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.828115 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8zrk2" event={"ID":"246d9375-0f70-4c31-ac82-63ecc1bdcd2b","Type":"ContainerStarted","Data":"e496572e458a84d608255d35802df5adcacea22182dc87a1ea4ba0fdc882fed7"} Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.835370 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-92w2x" event={"ID":"4ef17b80-df4a-454a-9680-65dce7903b36","Type":"ContainerStarted","Data":"0ec438df1e3fc332f6f924cc4a302b0afd151ca5713495ba5c9b9a30c2ccf935"} Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.835412 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-92w2x" 
event={"ID":"4ef17b80-df4a-454a-9680-65dce7903b36","Type":"ContainerStarted","Data":"e92a55c53ff88ecaaa5da3ea2ba34a277ce75be34493d74f2e7b8e0d761bc276"} Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.841695 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s5lzg" event={"ID":"b9c9596c-6124-44d0-b06b-a99477938b79","Type":"ContainerStarted","Data":"bb62fc292e03df68e9c8f52e8a06201e030456044052f9076e355b856862eddc"} Feb 17 15:11:17 crc kubenswrapper[4717]: W0217 15:11:17.842291 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a8a341e_2b01_4c3d_b65f_5af4fa9c8702.slice/crio-ea2cfae92f3b02f26ff831fdb8e1a84cd11f5a87bbba9658181ed2832a6371e9 WatchSource:0}: Error finding container ea2cfae92f3b02f26ff831fdb8e1a84cd11f5a87bbba9658181ed2832a6371e9: Status 404 returned error can't find the container with id ea2cfae92f3b02f26ff831fdb8e1a84cd11f5a87bbba9658181ed2832a6371e9 Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.878170 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-8zrk2" podStartSLOduration=1.878145824 podStartE2EDuration="1.878145824s" podCreationTimestamp="2026-02-17 15:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:11:17.849428206 +0000 UTC m=+1144.265268692" watchObservedRunningTime="2026-02-17 15:11:17.878145824 +0000 UTC m=+1144.293986300" Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.899456 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-92w2x" podStartSLOduration=2.899438621 podStartE2EDuration="2.899438621s" podCreationTimestamp="2026-02-17 15:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 
15:11:17.868303814 +0000 UTC m=+1144.284144280" watchObservedRunningTime="2026-02-17 15:11:17.899438621 +0000 UTC m=+1144.315279097" Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.919620 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-horizon-secret-key\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.919738 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqj5m\" (UniqueName: \"kubernetes.io/projected/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-kube-api-access-zqj5m\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.919764 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-scripts\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.919801 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-config-data\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:17 crc kubenswrapper[4717]: I0217 15:11:17.919839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-logs\") pod 
\"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.022120 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqj5m\" (UniqueName: \"kubernetes.io/projected/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-kube-api-access-zqj5m\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.022168 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-scripts\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.022239 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-config-data\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.022288 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-logs\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.022343 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-horizon-secret-key\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " 
pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.023256 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-logs\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.023457 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-scripts\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.024665 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-config-data\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.032970 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-horizon-secret-key\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.040114 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqj5m\" (UniqueName: \"kubernetes.io/projected/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-kube-api-access-zqj5m\") pod \"horizon-7b6b86dfcc-ptxwk\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.071434 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-cx6dj" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.133060 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.225839 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-nb\") pod \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.225931 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-dns-svc\") pod \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.226509 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-sb\") pod \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.226605 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-config\") pod \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\" (UID: \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.226706 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6fcb\" (UniqueName: \"kubernetes.io/projected/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-kube-api-access-j6fcb\") pod \"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\" (UID: 
\"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0\") " Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.230879 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-kube-api-access-j6fcb" (OuterVolumeSpecName: "kube-api-access-j6fcb") pod "4cdb9cb3-dc99-41a1-9197-36ca43ac52a0" (UID: "4cdb9cb3-dc99-41a1-9197-36ca43ac52a0"). InnerVolumeSpecName "kube-api-access-j6fcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.249936 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4cdb9cb3-dc99-41a1-9197-36ca43ac52a0" (UID: "4cdb9cb3-dc99-41a1-9197-36ca43ac52a0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.256888 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cdb9cb3-dc99-41a1-9197-36ca43ac52a0" (UID: "4cdb9cb3-dc99-41a1-9197-36ca43ac52a0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.256900 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-config" (OuterVolumeSpecName: "config") pod "4cdb9cb3-dc99-41a1-9197-36ca43ac52a0" (UID: "4cdb9cb3-dc99-41a1-9197-36ca43ac52a0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.263247 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4cdb9cb3-dc99-41a1-9197-36ca43ac52a0" (UID: "4cdb9cb3-dc99-41a1-9197-36ca43ac52a0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.328215 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.328240 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.328251 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.328259 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6fcb\" (UniqueName: \"kubernetes.io/projected/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-kube-api-access-j6fcb\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.328270 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.678814 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b6b86dfcc-ptxwk"] Feb 17 15:11:18 
crc kubenswrapper[4717]: W0217 15:11:18.708870 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabe2c2e7_c33b_44bf_9b73_7bf824f98c2b.slice/crio-4533decd9eb40118d0bcc085cfce679bf43cd04f28ce586c0418f71046e9474c WatchSource:0}: Error finding container 4533decd9eb40118d0bcc085cfce679bf43cd04f28ce586c0418f71046e9474c: Status 404 returned error can't find the container with id 4533decd9eb40118d0bcc085cfce679bf43cd04f28ce586c0418f71046e9474c Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.861268 4717 generic.go:334] "Generic (PLEG): container finished" podID="ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" containerID="af2920544484512942f30b3deaa6859c6b4094ddbfdc0fa8712cd66cca946fce" exitCode=0 Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.861356 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" event={"ID":"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf","Type":"ContainerDied","Data":"af2920544484512942f30b3deaa6859c6b4094ddbfdc0fa8712cd66cca946fce"} Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.861390 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" event={"ID":"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf","Type":"ContainerStarted","Data":"f57eba55ebd2a933abdd06cf55eb504eaa77e5d70d50f6f2fcf4a1abab56249d"} Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.866053 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6496446dcc-ktb8t" event={"ID":"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702","Type":"ContainerStarted","Data":"ea2cfae92f3b02f26ff831fdb8e1a84cd11f5a87bbba9658181ed2832a6371e9"} Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.869771 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b6b86dfcc-ptxwk" 
event={"ID":"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b","Type":"ContainerStarted","Data":"4533decd9eb40118d0bcc085cfce679bf43cd04f28ce586c0418f71046e9474c"} Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.877012 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-cx6dj" Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.878324 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-cx6dj" event={"ID":"4cdb9cb3-dc99-41a1-9197-36ca43ac52a0","Type":"ContainerDied","Data":"8200ad51765ad13c9bd93319065ebe7c79ceff8eebbbc8d435ee17f86aefa987"} Feb 17 15:11:18 crc kubenswrapper[4717]: I0217 15:11:18.878402 4717 scope.go:117] "RemoveContainer" containerID="b4e618c924f2a8281988eab68750102a52f26ce5f3877209794274f825ac2966" Feb 17 15:11:19 crc kubenswrapper[4717]: I0217 15:11:19.027141 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-cx6dj"] Feb 17 15:11:19 crc kubenswrapper[4717]: I0217 15:11:19.037617 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-cx6dj"] Feb 17 15:11:19 crc kubenswrapper[4717]: I0217 15:11:19.870216 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cdb9cb3-dc99-41a1-9197-36ca43ac52a0" path="/var/lib/kubelet/pods/4cdb9cb3-dc99-41a1-9197-36ca43ac52a0/volumes" Feb 17 15:11:19 crc kubenswrapper[4717]: I0217 15:11:19.919208 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" event={"ID":"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf","Type":"ContainerStarted","Data":"11b5defdab6b7391649212a44d32ee62f7660f387b887af49b09b3e843a3c95e"} Feb 17 15:11:19 crc kubenswrapper[4717]: I0217 15:11:19.920633 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:19 crc kubenswrapper[4717]: I0217 15:11:19.944898 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" podStartSLOduration=3.944875567 podStartE2EDuration="3.944875567s" podCreationTimestamp="2026-02-17 15:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:11:19.936944061 +0000 UTC m=+1146.352784547" watchObservedRunningTime="2026-02-17 15:11:19.944875567 +0000 UTC m=+1146.360716043" Feb 17 15:11:20 crc kubenswrapper[4717]: I0217 15:11:20.929160 4717 generic.go:334] "Generic (PLEG): container finished" podID="0b96c066-9919-4133-93df-69c9abdc0c6c" containerID="5e39d789f33c6404226590e54b1fffc1b8489430ebf7ad53c4a39530fdcd4a6d" exitCode=0 Feb 17 15:11:20 crc kubenswrapper[4717]: I0217 15:11:20.929197 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hvpfc" event={"ID":"0b96c066-9919-4133-93df-69c9abdc0c6c","Type":"ContainerDied","Data":"5e39d789f33c6404226590e54b1fffc1b8489430ebf7ad53c4a39530fdcd4a6d"} Feb 17 15:11:21 crc kubenswrapper[4717]: I0217 15:11:21.700979 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:11:21 crc kubenswrapper[4717]: I0217 15:11:21.710487 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/518c6b94-614f-42fd-9016-122cdcfcb8c9-etc-swift\") pod \"swift-storage-0\" (UID: \"518c6b94-614f-42fd-9016-122cdcfcb8c9\") " pod="openstack/swift-storage-0" Feb 17 15:11:21 crc kubenswrapper[4717]: I0217 15:11:21.872633 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.632719 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-894985b7-dwvdw"] Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.667572 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b88fd5cc6-dqjmc"] Feb 17 15:11:24 crc kubenswrapper[4717]: E0217 15:11:24.667947 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cdb9cb3-dc99-41a1-9197-36ca43ac52a0" containerName="init" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.667960 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cdb9cb3-dc99-41a1-9197-36ca43ac52a0" containerName="init" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.668151 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cdb9cb3-dc99-41a1-9197-36ca43ac52a0" containerName="init" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.672572 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.680224 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b88fd5cc6-dqjmc"] Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.682228 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.732971 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b6b86dfcc-ptxwk"] Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.769144 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-85b46995b-rj5bq"] Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.770851 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-tls-certs\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784215 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj6tq\" (UniqueName: \"kubernetes.io/projected/04bb64de-6640-4f6a-9052-ff0edf9dacb8-kube-api-access-gj6tq\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784243 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-scripts\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784552 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-secret-key\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784587 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-config-data\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: 
\"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784606 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/946c8c31-01d1-45f7-87c2-a022100aeef9-logs\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784636 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04bb64de-6640-4f6a-9052-ff0edf9dacb8-config-data\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784684 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-combined-ca-bundle\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784713 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04bb64de-6640-4f6a-9052-ff0edf9dacb8-scripts\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784746 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wf4p\" (UniqueName: \"kubernetes.io/projected/946c8c31-01d1-45f7-87c2-a022100aeef9-kube-api-access-4wf4p\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: 
\"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784763 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bb64de-6640-4f6a-9052-ff0edf9dacb8-logs\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784816 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04bb64de-6640-4f6a-9052-ff0edf9dacb8-horizon-secret-key\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784836 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bb64de-6640-4f6a-9052-ff0edf9dacb8-combined-ca-bundle\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.784858 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/04bb64de-6640-4f6a-9052-ff0edf9dacb8-horizon-tls-certs\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.786421 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85b46995b-rj5bq"] Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887521 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-secret-key\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887587 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-config-data\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887615 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/946c8c31-01d1-45f7-87c2-a022100aeef9-logs\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04bb64de-6640-4f6a-9052-ff0edf9dacb8-config-data\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-combined-ca-bundle\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887762 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04bb64de-6640-4f6a-9052-ff0edf9dacb8-scripts\") pod \"horizon-85b46995b-rj5bq\" (UID: 
\"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887821 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wf4p\" (UniqueName: \"kubernetes.io/projected/946c8c31-01d1-45f7-87c2-a022100aeef9-kube-api-access-4wf4p\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887845 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bb64de-6640-4f6a-9052-ff0edf9dacb8-logs\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887894 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04bb64de-6640-4f6a-9052-ff0edf9dacb8-horizon-secret-key\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887919 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bb64de-6640-4f6a-9052-ff0edf9dacb8-combined-ca-bundle\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887949 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/04bb64de-6640-4f6a-9052-ff0edf9dacb8-horizon-tls-certs\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 
15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.887987 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-tls-certs\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.888000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/946c8c31-01d1-45f7-87c2-a022100aeef9-logs\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.888023 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj6tq\" (UniqueName: \"kubernetes.io/projected/04bb64de-6640-4f6a-9052-ff0edf9dacb8-kube-api-access-gj6tq\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.888059 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-scripts\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.888852 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bb64de-6640-4f6a-9052-ff0edf9dacb8-logs\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.889267 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/04bb64de-6640-4f6a-9052-ff0edf9dacb8-config-data\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.889450 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04bb64de-6640-4f6a-9052-ff0edf9dacb8-scripts\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.890899 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-scripts\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.891125 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-config-data\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.902776 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-combined-ca-bundle\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.902821 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bb64de-6640-4f6a-9052-ff0edf9dacb8-combined-ca-bundle\") pod \"horizon-85b46995b-rj5bq\" (UID: 
\"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.904303 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-tls-certs\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.905469 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/04bb64de-6640-4f6a-9052-ff0edf9dacb8-horizon-tls-certs\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.905961 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-secret-key\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.909813 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj6tq\" (UniqueName: \"kubernetes.io/projected/04bb64de-6640-4f6a-9052-ff0edf9dacb8-kube-api-access-gj6tq\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:24 crc kubenswrapper[4717]: I0217 15:11:24.910724 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wf4p\" (UniqueName: \"kubernetes.io/projected/946c8c31-01d1-45f7-87c2-a022100aeef9-kube-api-access-4wf4p\") pod \"horizon-7b88fd5cc6-dqjmc\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:24 
crc kubenswrapper[4717]: I0217 15:11:24.918319 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04bb64de-6640-4f6a-9052-ff0edf9dacb8-horizon-secret-key\") pod \"horizon-85b46995b-rj5bq\" (UID: \"04bb64de-6640-4f6a-9052-ff0edf9dacb8\") " pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:25 crc kubenswrapper[4717]: I0217 15:11:25.014819 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:11:25 crc kubenswrapper[4717]: I0217 15:11:25.085101 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:11:26 crc kubenswrapper[4717]: I0217 15:11:26.989163 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:11:27 crc kubenswrapper[4717]: I0217 15:11:27.069487 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-g7mg8"] Feb 17 15:11:27 crc kubenswrapper[4717]: I0217 15:11:27.069914 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-g7mg8" podUID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerName="dnsmasq-dns" containerID="cri-o://15cc8d81e9ea799a7d93cb54ecc804873c28505e89516613adffc06a18ed205f" gracePeriod=10 Feb 17 15:11:29 crc kubenswrapper[4717]: I0217 15:11:29.025593 4717 generic.go:334] "Generic (PLEG): container finished" podID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerID="15cc8d81e9ea799a7d93cb54ecc804873c28505e89516613adffc06a18ed205f" exitCode=0 Feb 17 15:11:29 crc kubenswrapper[4717]: I0217 15:11:29.025617 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-g7mg8" event={"ID":"29c6b382-3315-4904-8fd9-dc7f1c993c2b","Type":"ContainerDied","Data":"15cc8d81e9ea799a7d93cb54ecc804873c28505e89516613adffc06a18ed205f"} Feb 17 
15:11:31 crc kubenswrapper[4717]: I0217 15:11:31.041553 4717 generic.go:334] "Generic (PLEG): container finished" podID="4ef17b80-df4a-454a-9680-65dce7903b36" containerID="0ec438df1e3fc332f6f924cc4a302b0afd151ca5713495ba5c9b9a30c2ccf935" exitCode=0 Feb 17 15:11:31 crc kubenswrapper[4717]: I0217 15:11:31.041830 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-92w2x" event={"ID":"4ef17b80-df4a-454a-9680-65dce7903b36","Type":"ContainerDied","Data":"0ec438df1e3fc332f6f924cc4a302b0afd151ca5713495ba5c9b9a30c2ccf935"} Feb 17 15:11:31 crc kubenswrapper[4717]: E0217 15:11:31.309421 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 17 15:11:31 crc kubenswrapper[4717]: E0217 15:11:31.309797 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n678h564h644hcch5fbh9ch667h656h658h55ch64bh544h5bh676h8bh597h5d6h68fh65dhd5hc9h5bch676h649h586h5fdh6dh6dhcch594h574h5fdq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v6q6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:11:31 crc kubenswrapper[4717]: I0217 15:11:31.812651 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-g7mg8" podUID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Feb 17 15:11:32 crc kubenswrapper[4717]: E0217 15:11:32.764803 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Feb 17 15:11:32 crc kubenswrapper[4717]: E0217 15:11:32.766095 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-thgx7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-fg4vl_openstack(7055a012-2f5d-4ba2-b56d-a9ec73e11944): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:11:32 crc kubenswrapper[4717]: E0217 15:11:32.768172 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-fg4vl" podUID="7055a012-2f5d-4ba2-b56d-a9ec73e11944" Feb 17 15:11:32 crc kubenswrapper[4717]: I0217 15:11:32.857987 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hvpfc" Feb 17 15:11:32 crc kubenswrapper[4717]: I0217 15:11:32.953548 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-config-data\") pod \"0b96c066-9919-4133-93df-69c9abdc0c6c\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " Feb 17 15:11:32 crc kubenswrapper[4717]: I0217 15:11:32.954621 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-combined-ca-bundle\") pod \"0b96c066-9919-4133-93df-69c9abdc0c6c\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " Feb 17 15:11:32 crc kubenswrapper[4717]: I0217 15:11:32.954687 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vfh2\" (UniqueName: \"kubernetes.io/projected/0b96c066-9919-4133-93df-69c9abdc0c6c-kube-api-access-4vfh2\") pod \"0b96c066-9919-4133-93df-69c9abdc0c6c\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " Feb 17 15:11:32 crc kubenswrapper[4717]: I0217 15:11:32.954784 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-db-sync-config-data\") pod \"0b96c066-9919-4133-93df-69c9abdc0c6c\" (UID: \"0b96c066-9919-4133-93df-69c9abdc0c6c\") " Feb 17 15:11:32 crc kubenswrapper[4717]: I0217 15:11:32.959801 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0b96c066-9919-4133-93df-69c9abdc0c6c" (UID: "0b96c066-9919-4133-93df-69c9abdc0c6c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:32 crc kubenswrapper[4717]: I0217 15:11:32.966516 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b96c066-9919-4133-93df-69c9abdc0c6c-kube-api-access-4vfh2" (OuterVolumeSpecName: "kube-api-access-4vfh2") pod "0b96c066-9919-4133-93df-69c9abdc0c6c" (UID: "0b96c066-9919-4133-93df-69c9abdc0c6c"). InnerVolumeSpecName "kube-api-access-4vfh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:32 crc kubenswrapper[4717]: I0217 15:11:32.991225 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b96c066-9919-4133-93df-69c9abdc0c6c" (UID: "0b96c066-9919-4133-93df-69c9abdc0c6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:33 crc kubenswrapper[4717]: I0217 15:11:33.002339 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-config-data" (OuterVolumeSpecName: "config-data") pod "0b96c066-9919-4133-93df-69c9abdc0c6c" (UID: "0b96c066-9919-4133-93df-69c9abdc0c6c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:33 crc kubenswrapper[4717]: I0217 15:11:33.057662 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:33 crc kubenswrapper[4717]: I0217 15:11:33.057698 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vfh2\" (UniqueName: \"kubernetes.io/projected/0b96c066-9919-4133-93df-69c9abdc0c6c-kube-api-access-4vfh2\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:33 crc kubenswrapper[4717]: I0217 15:11:33.057715 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:33 crc kubenswrapper[4717]: I0217 15:11:33.057728 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b96c066-9919-4133-93df-69c9abdc0c6c-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:33 crc kubenswrapper[4717]: I0217 15:11:33.061126 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hvpfc" Feb 17 15:11:33 crc kubenswrapper[4717]: I0217 15:11:33.061256 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hvpfc" event={"ID":"0b96c066-9919-4133-93df-69c9abdc0c6c","Type":"ContainerDied","Data":"c5100cd04039cf9f5c4e9b024412ef27d7a094f8871576ee15700b4fd88ef6b0"} Feb 17 15:11:33 crc kubenswrapper[4717]: I0217 15:11:33.061305 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5100cd04039cf9f5c4e9b024412ef27d7a094f8871576ee15700b4fd88ef6b0" Feb 17 15:11:33 crc kubenswrapper[4717]: E0217 15:11:33.061905 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-fg4vl" podUID="7055a012-2f5d-4ba2-b56d-a9ec73e11944" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.254889 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-54rqc"] Feb 17 15:11:34 crc kubenswrapper[4717]: E0217 15:11:34.255654 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b96c066-9919-4133-93df-69c9abdc0c6c" containerName="glance-db-sync" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.255667 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b96c066-9919-4133-93df-69c9abdc0c6c" containerName="glance-db-sync" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.255829 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b96c066-9919-4133-93df-69c9abdc0c6c" containerName="glance-db-sync" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.264326 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.280744 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.280809 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.280871 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-config\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.280907 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-dns-svc\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.280966 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb9wk\" (UniqueName: \"kubernetes.io/projected/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-kube-api-access-cb9wk\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: 
\"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.308553 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-54rqc"] Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.383985 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.384055 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.384135 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-config\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.384176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-dns-svc\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.384238 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb9wk\" (UniqueName: 
\"kubernetes.io/projected/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-kube-api-access-cb9wk\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.385225 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.385348 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-config\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.385949 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.386021 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-dns-svc\") pod \"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.421339 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb9wk\" (UniqueName: \"kubernetes.io/projected/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-kube-api-access-cb9wk\") pod 
\"dnsmasq-dns-f84976bdf-54rqc\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:34 crc kubenswrapper[4717]: I0217 15:11:34.587815 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.300976 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.302938 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.305480 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dgrmv" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.305574 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.305634 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.321467 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.403827 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.404364 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-scripts\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.404417 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.405032 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gfvl\" (UniqueName: \"kubernetes.io/projected/e20b1f7a-2627-4bde-9e82-d458ed9ba797-kube-api-access-4gfvl\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.405092 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.405234 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-logs\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.405263 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-config-data\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.472327 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.473779 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.475748 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.489412 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.507645 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-scripts\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.507736 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.507762 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gfvl\" (UniqueName: \"kubernetes.io/projected/e20b1f7a-2627-4bde-9e82-d458ed9ba797-kube-api-access-4gfvl\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") 
" pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.507783 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.507900 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-logs\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.507926 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-config-data\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.507977 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.509805 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-logs\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 
15:11:35.509869 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.510248 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.513985 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.519473 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-config-data\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.533221 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-scripts\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.534665 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4gfvl\" (UniqueName: \"kubernetes.io/projected/e20b1f7a-2627-4bde-9e82-d458ed9ba797-kube-api-access-4gfvl\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.564263 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.609895 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-logs\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.609984 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-scripts\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.610006 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.610025 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.610053 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gknvc\" (UniqueName: \"kubernetes.io/projected/619aaea4-acc6-488e-bb45-6406861b48db-kube-api-access-gknvc\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.610103 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.610128 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-config-data\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.623696 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.711305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-scripts\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.711355 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.711378 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.711424 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gknvc\" (UniqueName: \"kubernetes.io/projected/619aaea4-acc6-488e-bb45-6406861b48db-kube-api-access-gknvc\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.711469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 
17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.711497 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-config-data\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.711547 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-logs\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.712013 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.712098 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-logs\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.712306 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.725257 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.726852 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-config-data\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.731012 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-scripts\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.735832 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gknvc\" (UniqueName: \"kubernetes.io/projected/619aaea4-acc6-488e-bb45-6406861b48db-kube-api-access-gknvc\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.760380 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: I0217 15:11:35.799502 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 15:11:35 crc kubenswrapper[4717]: E0217 15:11:35.993738 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 17 15:11:35 crc kubenswrapper[4717]: E0217 15:11:35.994011 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmc6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Volu
meDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-6ncb2_openstack(de8aa40e-4ead-46b8-a94b-0a3602d030ef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:11:35 crc kubenswrapper[4717]: E0217 15:11:35.995167 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-6ncb2" podUID="de8aa40e-4ead-46b8-a94b-0a3602d030ef" Feb 17 15:11:36 crc kubenswrapper[4717]: E0217 15:11:36.093454 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-6ncb2" podUID="de8aa40e-4ead-46b8-a94b-0a3602d030ef" Feb 17 15:11:37 crc kubenswrapper[4717]: I0217 15:11:37.111644 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:11:37 crc kubenswrapper[4717]: I0217 15:11:37.203915 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:11:41 crc kubenswrapper[4717]: E0217 15:11:41.393153 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 15:11:41 crc kubenswrapper[4717]: E0217 15:11:41.394238 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8h67bh96h58dh59bh596h548h54ch69h695h5b9h597hf4hc5h5ddh68dh5c6h689h654h698hdchb4h5d4h588h675h5c4h699h5dfh576h65dh68ch5fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckcs2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6496446dcc-ktb8t_openstack(9a8a341e-2b01-4c3d-b65f-5af4fa9c8702): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:11:41 crc kubenswrapper[4717]: E0217 
15:11:41.396849 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6496446dcc-ktb8t" podUID="9a8a341e-2b01-4c3d-b65f-5af4fa9c8702" Feb 17 15:11:41 crc kubenswrapper[4717]: I0217 15:11:41.814003 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-g7mg8" podUID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Feb 17 15:11:46 crc kubenswrapper[4717]: I0217 15:11:46.197862 4717 generic.go:334] "Generic (PLEG): container finished" podID="246d9375-0f70-4c31-ac82-63ecc1bdcd2b" containerID="ec74d17aa82c8980ca1c5457c5c15d213e2abfa96f837f151625e09426ab161a" exitCode=0 Feb 17 15:11:46 crc kubenswrapper[4717]: I0217 15:11:46.197962 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8zrk2" event={"ID":"246d9375-0f70-4c31-ac82-63ecc1bdcd2b","Type":"ContainerDied","Data":"ec74d17aa82c8980ca1c5457c5c15d213e2abfa96f837f151625e09426ab161a"} Feb 17 15:11:46 crc kubenswrapper[4717]: I0217 15:11:46.814682 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-g7mg8" podUID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Feb 17 15:11:46 crc kubenswrapper[4717]: I0217 15:11:46.815020 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:11:51 crc kubenswrapper[4717]: E0217 15:11:51.466287 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 15:11:51 crc kubenswrapper[4717]: E0217 15:11:51.467203 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7dh5d8h54ch9ch674h655h6bhc4h5b5hfdh594h567hf4h66fhd7h58h654h5c6hc6h5dch5f8hdch566h76h55bh96h669h5d5h5dbhddh68dhcbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zqj5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Terminatio
nMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7b6b86dfcc-ptxwk_openstack(abe2c2e7-c33b-44bf-9b73-7bf824f98c2b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:11:51 crc kubenswrapper[4717]: E0217 15:11:51.475428 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7b6b86dfcc-ptxwk" podUID="abe2c2e7-c33b-44bf-9b73-7bf824f98c2b" Feb 17 15:11:51 crc kubenswrapper[4717]: E0217 15:11:51.519303 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 15:11:51 crc kubenswrapper[4717]: E0217 15:11:51.519621 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b6h674h579h76h58chd7h644h5f5hcbh598hd8h575h699hb6h689h546h556h597h5f6h5d8h68h594hbch57fh68dh694h585h65ch6dh8dhf5hbcq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cg9qt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-894985b7-dwvdw_openstack(0ad0e548-d0dc-452c-b749-713a58712d11): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:11:51 crc kubenswrapper[4717]: E0217 
15:11:51.523706 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-894985b7-dwvdw" podUID="0ad0e548-d0dc-452c-b749-713a58712d11" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.592183 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-92w2x" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.603408 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.621177 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.624554 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8zrk2" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.747945 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-scripts\") pod \"4ef17b80-df4a-454a-9680-65dce7903b36\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748003 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vch7q\" (UniqueName: \"kubernetes.io/projected/29c6b382-3315-4904-8fd9-dc7f1c993c2b-kube-api-access-vch7q\") pod \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748052 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-logs\") pod \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748107 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-config\") pod \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-fernet-keys\") pod \"4ef17b80-df4a-454a-9680-65dce7903b36\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748205 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-config\") pod \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748231 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-scripts\") pod \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748255 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p2r8\" (UniqueName: \"kubernetes.io/projected/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-kube-api-access-2p2r8\") pod \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748308 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-horizon-secret-key\") pod \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748347 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-combined-ca-bundle\") pod \"4ef17b80-df4a-454a-9680-65dce7903b36\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748383 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-config-data\") pod \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 
15:11:51.748411 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-nb\") pod \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748432 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-combined-ca-bundle\") pod \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\" (UID: \"246d9375-0f70-4c31-ac82-63ecc1bdcd2b\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748476 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-sb\") pod \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748505 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-dns-svc\") pod \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\" (UID: \"29c6b382-3315-4904-8fd9-dc7f1c993c2b\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748525 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-credential-keys\") pod \"4ef17b80-df4a-454a-9680-65dce7903b36\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748549 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-config-data\") pod 
\"4ef17b80-df4a-454a-9680-65dce7903b36\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748574 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlpz9\" (UniqueName: \"kubernetes.io/projected/4ef17b80-df4a-454a-9680-65dce7903b36-kube-api-access-rlpz9\") pod \"4ef17b80-df4a-454a-9680-65dce7903b36\" (UID: \"4ef17b80-df4a-454a-9680-65dce7903b36\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.748638 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckcs2\" (UniqueName: \"kubernetes.io/projected/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-kube-api-access-ckcs2\") pod \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\" (UID: \"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702\") " Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.753265 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c6b382-3315-4904-8fd9-dc7f1c993c2b-kube-api-access-vch7q" (OuterVolumeSpecName: "kube-api-access-vch7q") pod "29c6b382-3315-4904-8fd9-dc7f1c993c2b" (UID: "29c6b382-3315-4904-8fd9-dc7f1c993c2b"). InnerVolumeSpecName "kube-api-access-vch7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.756980 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-kube-api-access-2p2r8" (OuterVolumeSpecName: "kube-api-access-2p2r8") pod "246d9375-0f70-4c31-ac82-63ecc1bdcd2b" (UID: "246d9375-0f70-4c31-ac82-63ecc1bdcd2b"). InnerVolumeSpecName "kube-api-access-2p2r8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.757416 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-scripts" (OuterVolumeSpecName: "scripts") pod "9a8a341e-2b01-4c3d-b65f-5af4fa9c8702" (UID: "9a8a341e-2b01-4c3d-b65f-5af4fa9c8702"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.758339 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-kube-api-access-ckcs2" (OuterVolumeSpecName: "kube-api-access-ckcs2") pod "9a8a341e-2b01-4c3d-b65f-5af4fa9c8702" (UID: "9a8a341e-2b01-4c3d-b65f-5af4fa9c8702"). InnerVolumeSpecName "kube-api-access-ckcs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.758743 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4ef17b80-df4a-454a-9680-65dce7903b36" (UID: "4ef17b80-df4a-454a-9680-65dce7903b36"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.758952 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-logs" (OuterVolumeSpecName: "logs") pod "9a8a341e-2b01-4c3d-b65f-5af4fa9c8702" (UID: "9a8a341e-2b01-4c3d-b65f-5af4fa9c8702"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.760656 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-config-data" (OuterVolumeSpecName: "config-data") pod "9a8a341e-2b01-4c3d-b65f-5af4fa9c8702" (UID: "9a8a341e-2b01-4c3d-b65f-5af4fa9c8702"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.763830 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4ef17b80-df4a-454a-9680-65dce7903b36" (UID: "4ef17b80-df4a-454a-9680-65dce7903b36"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.769046 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-scripts" (OuterVolumeSpecName: "scripts") pod "4ef17b80-df4a-454a-9680-65dce7903b36" (UID: "4ef17b80-df4a-454a-9680-65dce7903b36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.770236 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef17b80-df4a-454a-9680-65dce7903b36-kube-api-access-rlpz9" (OuterVolumeSpecName: "kube-api-access-rlpz9") pod "4ef17b80-df4a-454a-9680-65dce7903b36" (UID: "4ef17b80-df4a-454a-9680-65dce7903b36"). InnerVolumeSpecName "kube-api-access-rlpz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.774308 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9a8a341e-2b01-4c3d-b65f-5af4fa9c8702" (UID: "9a8a341e-2b01-4c3d-b65f-5af4fa9c8702"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.815695 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-g7mg8" podUID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.851528 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.851576 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.851595 4717 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.851615 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlpz9\" (UniqueName: \"kubernetes.io/projected/4ef17b80-df4a-454a-9680-65dce7903b36-kube-api-access-rlpz9\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.851634 4717 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-ckcs2\" (UniqueName: \"kubernetes.io/projected/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-kube-api-access-ckcs2\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.851650 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.851667 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vch7q\" (UniqueName: \"kubernetes.io/projected/29c6b382-3315-4904-8fd9-dc7f1c993c2b-kube-api-access-vch7q\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.851683 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.851701 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.851716 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:51 crc kubenswrapper[4717]: I0217 15:11:51.851731 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p2r8\" (UniqueName: \"kubernetes.io/projected/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-kube-api-access-2p2r8\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.049687 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-config-data" (OuterVolumeSpecName: 
"config-data") pod "4ef17b80-df4a-454a-9680-65dce7903b36" (UID: "4ef17b80-df4a-454a-9680-65dce7903b36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.053655 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ef17b80-df4a-454a-9680-65dce7903b36" (UID: "4ef17b80-df4a-454a-9680-65dce7903b36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.058771 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.058812 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef17b80-df4a-454a-9680-65dce7903b36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.066114 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "246d9375-0f70-4c31-ac82-63ecc1bdcd2b" (UID: "246d9375-0f70-4c31-ac82-63ecc1bdcd2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.071497 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29c6b382-3315-4904-8fd9-dc7f1c993c2b" (UID: "29c6b382-3315-4904-8fd9-dc7f1c993c2b"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.073483 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-config" (OuterVolumeSpecName: "config") pod "246d9375-0f70-4c31-ac82-63ecc1bdcd2b" (UID: "246d9375-0f70-4c31-ac82-63ecc1bdcd2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.075447 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29c6b382-3315-4904-8fd9-dc7f1c993c2b" (UID: "29c6b382-3315-4904-8fd9-dc7f1c993c2b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.075621 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-config" (OuterVolumeSpecName: "config") pod "29c6b382-3315-4904-8fd9-dc7f1c993c2b" (UID: "29c6b382-3315-4904-8fd9-dc7f1c993c2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.087848 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29c6b382-3315-4904-8fd9-dc7f1c993c2b" (UID: "29c6b382-3315-4904-8fd9-dc7f1c993c2b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.143007 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-54rqc"] Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.160354 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.160739 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.160749 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.160770 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d9375-0f70-4c31-ac82-63ecc1bdcd2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.160783 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.160793 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29c6b382-3315-4904-8fd9-dc7f1c993c2b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.245620 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6496446dcc-ktb8t" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.245622 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6496446dcc-ktb8t" event={"ID":"9a8a341e-2b01-4c3d-b65f-5af4fa9c8702","Type":"ContainerDied","Data":"ea2cfae92f3b02f26ff831fdb8e1a84cd11f5a87bbba9658181ed2832a6371e9"} Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.248155 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-g7mg8" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.248154 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-g7mg8" event={"ID":"29c6b382-3315-4904-8fd9-dc7f1c993c2b","Type":"ContainerDied","Data":"11038af40f1c6abf543922a6b192b7b794561349a6e33519d7c678ffcff325ab"} Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.248231 4717 scope.go:117] "RemoveContainer" containerID="15cc8d81e9ea799a7d93cb54ecc804873c28505e89516613adffc06a18ed205f" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.252031 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8zrk2" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.252037 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8zrk2" event={"ID":"246d9375-0f70-4c31-ac82-63ecc1bdcd2b","Type":"ContainerDied","Data":"e496572e458a84d608255d35802df5adcacea22182dc87a1ea4ba0fdc882fed7"} Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.252318 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e496572e458a84d608255d35802df5adcacea22182dc87a1ea4ba0fdc882fed7" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.254610 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-92w2x" event={"ID":"4ef17b80-df4a-454a-9680-65dce7903b36","Type":"ContainerDied","Data":"e92a55c53ff88ecaaa5da3ea2ba34a277ce75be34493d74f2e7b8e0d761bc276"} Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.254647 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e92a55c53ff88ecaaa5da3ea2ba34a277ce75be34493d74f2e7b8e0d761bc276" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.254738 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-92w2x" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.274879 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-g7mg8"] Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.285605 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-g7mg8"] Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.313134 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6496446dcc-ktb8t"] Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.321402 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6496446dcc-ktb8t"] Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.668660 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-92w2x"] Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.678212 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-92w2x"] Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.772944 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gxq6g"] Feb 17 15:11:52 crc kubenswrapper[4717]: E0217 15:11:52.773298 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef17b80-df4a-454a-9680-65dce7903b36" containerName="keystone-bootstrap" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.773311 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef17b80-df4a-454a-9680-65dce7903b36" containerName="keystone-bootstrap" Feb 17 15:11:52 crc kubenswrapper[4717]: E0217 15:11:52.773329 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerName="dnsmasq-dns" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.773337 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerName="dnsmasq-dns" Feb 17 15:11:52 
crc kubenswrapper[4717]: E0217 15:11:52.773355 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246d9375-0f70-4c31-ac82-63ecc1bdcd2b" containerName="neutron-db-sync" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.773361 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="246d9375-0f70-4c31-ac82-63ecc1bdcd2b" containerName="neutron-db-sync" Feb 17 15:11:52 crc kubenswrapper[4717]: E0217 15:11:52.773375 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerName="init" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.773381 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerName="init" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.773534 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef17b80-df4a-454a-9680-65dce7903b36" containerName="keystone-bootstrap" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.773549 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="246d9375-0f70-4c31-ac82-63ecc1bdcd2b" containerName="neutron-db-sync" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.773563 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" containerName="dnsmasq-dns" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.777570 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.780294 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.780473 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.783694 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.784228 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.787999 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sdkz9" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.797671 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gxq6g"] Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.822421 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzfdw\" (UniqueName: \"kubernetes.io/projected/de3759e2-5565-40bd-8b02-f3c1f0d55863-kube-api-access-gzfdw\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.822505 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-credential-keys\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.822564 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-fernet-keys\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.822599 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-combined-ca-bundle\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.822618 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-config-data\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.822634 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-scripts\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.859420 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-54rqc"] Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.896950 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fb745b69-blhdn"] Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.901434 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.912506 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-blhdn"] Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.931325 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-credential-keys\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.931413 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-fernet-keys\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.931442 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-dns-svc\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.931467 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kblvc\" (UniqueName: \"kubernetes.io/projected/d5afdf52-2ad7-4957-8bff-ca95fee13432-kube-api-access-kblvc\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.931491 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.931513 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-combined-ca-bundle\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.931533 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-config-data\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.931549 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-scripts\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.931592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzfdw\" (UniqueName: \"kubernetes.io/projected/de3759e2-5565-40bd-8b02-f3c1f0d55863-kube-api-access-gzfdw\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.931622 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-config\") pod 
\"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.931651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.936192 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-scripts\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.937659 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-fernet-keys\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.938299 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-config-data\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.943562 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-combined-ca-bundle\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 
15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.946632 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-credential-keys\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:52 crc kubenswrapper[4717]: I0217 15:11:52.963785 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzfdw\" (UniqueName: \"kubernetes.io/projected/de3759e2-5565-40bd-8b02-f3c1f0d55863-kube-api-access-gzfdw\") pod \"keystone-bootstrap-gxq6g\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.020354 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d69c5bc8d-pt8k6"] Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.022196 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.025132 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.025259 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.025256 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-25rgb" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.025488 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.034381 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-config\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.034440 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.034544 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-dns-svc\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.034572 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kblvc\" (UniqueName: \"kubernetes.io/projected/d5afdf52-2ad7-4957-8bff-ca95fee13432-kube-api-access-kblvc\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.034589 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.035377 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.035389 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-config\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.035957 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-dns-svc\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.036189 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-sb\") pod 
\"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.037175 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d69c5bc8d-pt8k6"] Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.053749 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kblvc\" (UniqueName: \"kubernetes.io/projected/d5afdf52-2ad7-4957-8bff-ca95fee13432-kube-api-access-kblvc\") pod \"dnsmasq-dns-fb745b69-blhdn\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.097211 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.138545 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-ovndb-tls-certs\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.138607 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cbtd\" (UniqueName: \"kubernetes.io/projected/a36880ff-5e50-413f-8213-79ed16bed713-kube-api-access-5cbtd\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.138650 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-config\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: 
\"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.138680 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-combined-ca-bundle\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.138724 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-httpd-config\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.217749 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.240593 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-ovndb-tls-certs\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.241212 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cbtd\" (UniqueName: \"kubernetes.io/projected/a36880ff-5e50-413f-8213-79ed16bed713-kube-api-access-5cbtd\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.241254 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-config\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.241284 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-combined-ca-bundle\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.241966 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-httpd-config\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.246983 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-config\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.247757 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-combined-ca-bundle\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.248329 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-ovndb-tls-certs\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: 
\"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.248824 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-httpd-config\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.259067 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cbtd\" (UniqueName: \"kubernetes.io/projected/a36880ff-5e50-413f-8213-79ed16bed713-kube-api-access-5cbtd\") pod \"neutron-6d69c5bc8d-pt8k6\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") " pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.344893 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:53 crc kubenswrapper[4717]: E0217 15:11:53.489817 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 17 15:11:53 crc kubenswrapper[4717]: E0217 15:11:53.489961 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2klgj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-s5lzg_openstack(b9c9596c-6124-44d0-b06b-a99477938b79): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:11:53 crc kubenswrapper[4717]: E0217 15:11:53.491148 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-s5lzg" podUID="b9c9596c-6124-44d0-b06b-a99477938b79" Feb 17 15:11:53 crc kubenswrapper[4717]: E0217 15:11:53.730356 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified" Feb 17 15:11:53 crc kubenswrapper[4717]: E0217 15:11:53.730517 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n678h564h644hcch5fbh9ch667h656h658h55ch64bh544h5bh676h8bh597h5d6h68fh65dhd5hc9h5bch676h649h586h5fdh6dh6dhcch594h574h5fdq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v6q6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.824242 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-894985b7-dwvdw" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.826950 4717 scope.go:117] "RemoveContainer" containerID="7ec9aca08d8dd4a2a00430f320c52fee44835265cc02485a134fda7634f93c2e" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.829837 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.854146 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-scripts\") pod \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.854436 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg9qt\" (UniqueName: \"kubernetes.io/projected/0ad0e548-d0dc-452c-b749-713a58712d11-kube-api-access-cg9qt\") pod \"0ad0e548-d0dc-452c-b749-713a58712d11\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.854486 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-config-data\") pod \"0ad0e548-d0dc-452c-b749-713a58712d11\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.854502 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-config-data\") pod \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.854903 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-scripts" (OuterVolumeSpecName: "scripts") pod "abe2c2e7-c33b-44bf-9b73-7bf824f98c2b" (UID: "abe2c2e7-c33b-44bf-9b73-7bf824f98c2b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.855012 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-config-data" (OuterVolumeSpecName: "config-data") pod "0ad0e548-d0dc-452c-b749-713a58712d11" (UID: "0ad0e548-d0dc-452c-b749-713a58712d11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.855035 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-scripts\") pod \"0ad0e548-d0dc-452c-b749-713a58712d11\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.855067 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqj5m\" (UniqueName: \"kubernetes.io/projected/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-kube-api-access-zqj5m\") pod \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.855325 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-scripts" (OuterVolumeSpecName: "scripts") pod "0ad0e548-d0dc-452c-b749-713a58712d11" (UID: "0ad0e548-d0dc-452c-b749-713a58712d11"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.855366 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-logs\") pod \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.855422 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ad0e548-d0dc-452c-b749-713a58712d11-horizon-secret-key\") pod \"0ad0e548-d0dc-452c-b749-713a58712d11\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.855184 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-config-data" (OuterVolumeSpecName: "config-data") pod "abe2c2e7-c33b-44bf-9b73-7bf824f98c2b" (UID: "abe2c2e7-c33b-44bf-9b73-7bf824f98c2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.855977 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-logs" (OuterVolumeSpecName: "logs") pod "abe2c2e7-c33b-44bf-9b73-7bf824f98c2b" (UID: "abe2c2e7-c33b-44bf-9b73-7bf824f98c2b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.856027 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-horizon-secret-key\") pod \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\" (UID: \"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b\") " Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.856122 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad0e548-d0dc-452c-b749-713a58712d11-logs\") pod \"0ad0e548-d0dc-452c-b749-713a58712d11\" (UID: \"0ad0e548-d0dc-452c-b749-713a58712d11\") " Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.856614 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ad0e548-d0dc-452c-b749-713a58712d11-logs" (OuterVolumeSpecName: "logs") pod "0ad0e548-d0dc-452c-b749-713a58712d11" (UID: "0ad0e548-d0dc-452c-b749-713a58712d11"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.856898 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad0e548-d0dc-452c-b749-713a58712d11-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.856910 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.856918 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.856926 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.856933 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ad0e548-d0dc-452c-b749-713a58712d11-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.856941 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.861108 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad0e548-d0dc-452c-b749-713a58712d11-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0ad0e548-d0dc-452c-b749-713a58712d11" (UID: "0ad0e548-d0dc-452c-b749-713a58712d11"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.864066 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "abe2c2e7-c33b-44bf-9b73-7bf824f98c2b" (UID: "abe2c2e7-c33b-44bf-9b73-7bf824f98c2b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.867491 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c6b382-3315-4904-8fd9-dc7f1c993c2b" path="/var/lib/kubelet/pods/29c6b382-3315-4904-8fd9-dc7f1c993c2b/volumes" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.868372 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef17b80-df4a-454a-9680-65dce7903b36" path="/var/lib/kubelet/pods/4ef17b80-df4a-454a-9680-65dce7903b36/volumes" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.868926 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a8a341e-2b01-4c3d-b65f-5af4fa9c8702" path="/var/lib/kubelet/pods/9a8a341e-2b01-4c3d-b65f-5af4fa9c8702/volumes" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.875392 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad0e548-d0dc-452c-b749-713a58712d11-kube-api-access-cg9qt" (OuterVolumeSpecName: "kube-api-access-cg9qt") pod "0ad0e548-d0dc-452c-b749-713a58712d11" (UID: "0ad0e548-d0dc-452c-b749-713a58712d11"). InnerVolumeSpecName "kube-api-access-cg9qt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.875385 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-kube-api-access-zqj5m" (OuterVolumeSpecName: "kube-api-access-zqj5m") pod "abe2c2e7-c33b-44bf-9b73-7bf824f98c2b" (UID: "abe2c2e7-c33b-44bf-9b73-7bf824f98c2b"). InnerVolumeSpecName "kube-api-access-zqj5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.958496 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ad0e548-d0dc-452c-b749-713a58712d11-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.958526 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.958538 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg9qt\" (UniqueName: \"kubernetes.io/projected/0ad0e548-d0dc-452c-b749-713a58712d11-kube-api-access-cg9qt\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:53 crc kubenswrapper[4717]: I0217 15:11:53.958550 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqj5m\" (UniqueName: \"kubernetes.io/projected/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b-kube-api-access-zqj5m\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.281059 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6ncb2" event={"ID":"de8aa40e-4ead-46b8-a94b-0a3602d030ef","Type":"ContainerStarted","Data":"649701b8f1b2cab221feb3d8ed2d837aee0b938775799a8853384119a3b8a680"} Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 
15:11:54.282925 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-894985b7-dwvdw" event={"ID":"0ad0e548-d0dc-452c-b749-713a58712d11","Type":"ContainerDied","Data":"69a4b485011444c5052601b81117165df4203e6d631eeee4943a706686cadcd1"} Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.283012 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-894985b7-dwvdw" Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.311878 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fg4vl" event={"ID":"7055a012-2f5d-4ba2-b56d-a9ec73e11944","Type":"ContainerStarted","Data":"967062522ba9d55030f87bbfbcaf4f9488d1ddd65c5f0052dcc040e5f828b911"} Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.327379 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cf41b7d-00cb-4bde-bfc5-05f1733d6b77" containerID="b55649045e78ea048dc6dc5493488562b0aecd3af115733de5148fff3c505094" exitCode=0 Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.327582 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-54rqc" event={"ID":"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77","Type":"ContainerDied","Data":"b55649045e78ea048dc6dc5493488562b0aecd3af115733de5148fff3c505094"} Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.327606 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-54rqc" event={"ID":"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77","Type":"ContainerStarted","Data":"58d20c34ca80150451c3a249c3158df32e3d5d22de9178de60b9b2f3333437c0"} Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.330865 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-6ncb2" podStartSLOduration=1.923246893 podStartE2EDuration="38.330844256s" podCreationTimestamp="2026-02-17 15:11:16 +0000 UTC" firstStartedPulling="2026-02-17 15:11:17.494232329 +0000 UTC 
m=+1143.910072805" lastFinishedPulling="2026-02-17 15:11:53.901829692 +0000 UTC m=+1180.317670168" observedRunningTime="2026-02-17 15:11:54.304076127 +0000 UTC m=+1180.719916613" watchObservedRunningTime="2026-02-17 15:11:54.330844256 +0000 UTC m=+1180.746684732" Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.338466 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b6b86dfcc-ptxwk" Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.338514 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b6b86dfcc-ptxwk" event={"ID":"abe2c2e7-c33b-44bf-9b73-7bf824f98c2b","Type":"ContainerDied","Data":"4533decd9eb40118d0bcc085cfce679bf43cd04f28ce586c0418f71046e9474c"} Feb 17 15:11:54 crc kubenswrapper[4717]: E0217 15:11:54.379807 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-s5lzg" podUID="b9c9596c-6124-44d0-b06b-a99477938b79" Feb 17 15:11:54 crc kubenswrapper[4717]: E0217 15:11:54.511123 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ad0e548_d0dc_452c_b749_713a58712d11.slice/crio-69a4b485011444c5052601b81117165df4203e6d631eeee4943a706686cadcd1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ad0e548_d0dc_452c_b749_713a58712d11.slice\": RecentStats: unable to find data in memory cache]" Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.718117 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b88fd5cc6-dqjmc"] Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.728730 4717 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/placement-db-sync-fg4vl" podStartSLOduration=2.5349239150000002 podStartE2EDuration="38.728702637s" podCreationTimestamp="2026-02-17 15:11:16 +0000 UTC" firstStartedPulling="2026-02-17 15:11:17.676242818 +0000 UTC m=+1144.092083294" lastFinishedPulling="2026-02-17 15:11:53.87002154 +0000 UTC m=+1180.285862016" observedRunningTime="2026-02-17 15:11:54.693985853 +0000 UTC m=+1181.109826339" watchObservedRunningTime="2026-02-17 15:11:54.728702637 +0000 UTC m=+1181.144543113" Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.817913 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-894985b7-dwvdw"] Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.824433 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-894985b7-dwvdw"] Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.855447 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b6b86dfcc-ptxwk"] Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.901019 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85b46995b-rj5bq"] Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.924712 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b6b86dfcc-ptxwk"] Feb 17 15:11:54 crc kubenswrapper[4717]: I0217 15:11:54.976762 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.080110 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.264628 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-blhdn"] Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.270755 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gxq6g"] Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 
15:11:55.371913 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"216eede9ee0ae7771168b533010f1d8a12d39d6ae0fb0f340e0addfc8fc88bbf"} Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.374231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b88fd5cc6-dqjmc" event={"ID":"946c8c31-01d1-45f7-87c2-a022100aeef9","Type":"ContainerStarted","Data":"8622e1417c766e9d3f34c004d62d9076982ac31c050af9355bb76223d14efdaf"} Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.374780 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.376480 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"619aaea4-acc6-488e-bb45-6406861b48db","Type":"ContainerStarted","Data":"502eb9b2427797dc47ca0150db7e3dd66816a58aefd4538f6e6e11bc21074650"} Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.381231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-blhdn" event={"ID":"d5afdf52-2ad7-4957-8bff-ca95fee13432","Type":"ContainerStarted","Data":"aef90f1a9830717ce75a35d71555ca8b2057e84b0878bd8c4898dd3d973f4778"} Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.384209 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85b46995b-rj5bq" event={"ID":"04bb64de-6640-4f6a-9052-ff0edf9dacb8","Type":"ContainerStarted","Data":"ea5a6696b8c9b2334ea6e7571114452281a721fe41c8015001d374ddb6afbc08"} Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.386495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxq6g" event={"ID":"de3759e2-5565-40bd-8b02-f3c1f0d55863","Type":"ContainerStarted","Data":"44c38b1174eb9c38869dba5643b9531bc72a5d3852a2b927ec79a12eee867a9d"} Feb 17 15:11:55 
crc kubenswrapper[4717]: W0217 15:11:55.396666 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode20b1f7a_2627_4bde_9e82_d458ed9ba797.slice/crio-4cc357a1c47bf0d4b2978f8b262c858ec9a8694e0e4f8aab2e860013fd6ce8a9 WatchSource:0}: Error finding container 4cc357a1c47bf0d4b2978f8b262c858ec9a8694e0e4f8aab2e860013fd6ce8a9: Status 404 returned error can't find the container with id 4cc357a1c47bf0d4b2978f8b262c858ec9a8694e0e4f8aab2e860013fd6ce8a9 Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.403169 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.484984 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d69c5bc8d-pt8k6"] Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.501640 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-config\") pod \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.501725 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-nb\") pod \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.501844 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb9wk\" (UniqueName: \"kubernetes.io/projected/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-kube-api-access-cb9wk\") pod \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.501923 
4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-dns-svc\") pod \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.502014 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-sb\") pod \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\" (UID: \"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77\") " Feb 17 15:11:55 crc kubenswrapper[4717]: W0217 15:11:55.503895 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda36880ff_5e50_413f_8213_79ed16bed713.slice/crio-3c70c371a8e8e6829c87476fdc229459c4c1ce572d323d5ad85cf70bf024746e WatchSource:0}: Error finding container 3c70c371a8e8e6829c87476fdc229459c4c1ce572d323d5ad85cf70bf024746e: Status 404 returned error can't find the container with id 3c70c371a8e8e6829c87476fdc229459c4c1ce572d323d5ad85cf70bf024746e Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.511428 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-kube-api-access-cb9wk" (OuterVolumeSpecName: "kube-api-access-cb9wk") pod "7cf41b7d-00cb-4bde-bfc5-05f1733d6b77" (UID: "7cf41b7d-00cb-4bde-bfc5-05f1733d6b77"). InnerVolumeSpecName "kube-api-access-cb9wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.527598 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7cf41b7d-00cb-4bde-bfc5-05f1733d6b77" (UID: "7cf41b7d-00cb-4bde-bfc5-05f1733d6b77"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.533010 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7cf41b7d-00cb-4bde-bfc5-05f1733d6b77" (UID: "7cf41b7d-00cb-4bde-bfc5-05f1733d6b77"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.545257 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-config" (OuterVolumeSpecName: "config") pod "7cf41b7d-00cb-4bde-bfc5-05f1733d6b77" (UID: "7cf41b7d-00cb-4bde-bfc5-05f1733d6b77"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.587774 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7cf41b7d-00cb-4bde-bfc5-05f1733d6b77" (UID: "7cf41b7d-00cb-4bde-bfc5-05f1733d6b77"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.609624 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb9wk\" (UniqueName: \"kubernetes.io/projected/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-kube-api-access-cb9wk\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.609669 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.609682 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.609695 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.609706 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.630171 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c77f44b65-wwnph"] Feb 17 15:11:55 crc kubenswrapper[4717]: E0217 15:11:55.633161 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf41b7d-00cb-4bde-bfc5-05f1733d6b77" containerName="init" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.633203 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf41b7d-00cb-4bde-bfc5-05f1733d6b77" containerName="init" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.633472 4717 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf41b7d-00cb-4bde-bfc5-05f1733d6b77" containerName="init" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.634601 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.637119 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.649443 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.653214 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c77f44b65-wwnph"] Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.717475 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-httpd-config\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.717519 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-combined-ca-bundle\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.717558 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-ovndb-tls-certs\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 
15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.717615 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-public-tls-certs\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.717648 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-internal-tls-certs\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.717684 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-config\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.717699 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phd9g\" (UniqueName: \"kubernetes.io/projected/4e706234-18f2-467c-8681-79402d25eb1c-kube-api-access-phd9g\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.824911 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-httpd-config\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: 
I0217 15:11:55.824962 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-combined-ca-bundle\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.825009 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-ovndb-tls-certs\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.825072 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-public-tls-certs\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.825123 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-internal-tls-certs\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.825163 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-config\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.825184 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-phd9g\" (UniqueName: \"kubernetes.io/projected/4e706234-18f2-467c-8681-79402d25eb1c-kube-api-access-phd9g\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.836394 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-httpd-config\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.843711 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-internal-tls-certs\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.844020 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-public-tls-certs\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.844950 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-config\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.846010 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-combined-ca-bundle\") pod \"neutron-c77f44b65-wwnph\" 
(UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.858796 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-ovndb-tls-certs\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.859331 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phd9g\" (UniqueName: \"kubernetes.io/projected/4e706234-18f2-467c-8681-79402d25eb1c-kube-api-access-phd9g\") pod \"neutron-c77f44b65-wwnph\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.924951 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad0e548-d0dc-452c-b749-713a58712d11" path="/var/lib/kubelet/pods/0ad0e548-d0dc-452c-b749-713a58712d11/volumes" Feb 17 15:11:55 crc kubenswrapper[4717]: I0217 15:11:55.925397 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe2c2e7-c33b-44bf-9b73-7bf824f98c2b" path="/var/lib/kubelet/pods/abe2c2e7-c33b-44bf-9b73-7bf824f98c2b/volumes" Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.131973 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.411015 4717 generic.go:334] "Generic (PLEG): container finished" podID="d5afdf52-2ad7-4957-8bff-ca95fee13432" containerID="22b85e01f2962e3483aedf7b29a00fb5a65948fcf4751ac2aab5382a8d9c3b96" exitCode=0 Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.411094 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-blhdn" event={"ID":"d5afdf52-2ad7-4957-8bff-ca95fee13432","Type":"ContainerDied","Data":"22b85e01f2962e3483aedf7b29a00fb5a65948fcf4751ac2aab5382a8d9c3b96"} Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.423855 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85b46995b-rj5bq" event={"ID":"04bb64de-6640-4f6a-9052-ff0edf9dacb8","Type":"ContainerStarted","Data":"58c925a36e70b46ac5dcea34ae7998bc472e09c423fa1ff18f1509a3af02d074"} Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.424197 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85b46995b-rj5bq" event={"ID":"04bb64de-6640-4f6a-9052-ff0edf9dacb8","Type":"ContainerStarted","Data":"7e42544b93c1d71b64867ab95a9cd995790f4b02874b183b4c1d7f1668a5fc20"} Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.479609 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-85b46995b-rj5bq" podStartSLOduration=31.870546962 podStartE2EDuration="32.47958851s" podCreationTimestamp="2026-02-17 15:11:24 +0000 UTC" firstStartedPulling="2026-02-17 15:11:54.851925721 +0000 UTC m=+1181.267766197" lastFinishedPulling="2026-02-17 15:11:55.460967269 +0000 UTC m=+1181.876807745" observedRunningTime="2026-02-17 15:11:56.479435116 +0000 UTC m=+1182.895275612" watchObservedRunningTime="2026-02-17 15:11:56.47958851 +0000 UTC m=+1182.895428986" Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.487603 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"e20b1f7a-2627-4bde-9e82-d458ed9ba797","Type":"ContainerStarted","Data":"4cc357a1c47bf0d4b2978f8b262c858ec9a8694e0e4f8aab2e860013fd6ce8a9"} Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.519153 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-54rqc" event={"ID":"7cf41b7d-00cb-4bde-bfc5-05f1733d6b77","Type":"ContainerDied","Data":"58d20c34ca80150451c3a249c3158df32e3d5d22de9178de60b9b2f3333437c0"} Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.519232 4717 scope.go:117] "RemoveContainer" containerID="b55649045e78ea048dc6dc5493488562b0aecd3af115733de5148fff3c505094" Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.519361 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-54rqc" Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.546926 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d69c5bc8d-pt8k6" event={"ID":"a36880ff-5e50-413f-8213-79ed16bed713","Type":"ContainerStarted","Data":"5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277"} Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.546973 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d69c5bc8d-pt8k6" event={"ID":"a36880ff-5e50-413f-8213-79ed16bed713","Type":"ContainerStarted","Data":"3c70c371a8e8e6829c87476fdc229459c4c1ce572d323d5ad85cf70bf024746e"} Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.571834 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxq6g" event={"ID":"de3759e2-5565-40bd-8b02-f3c1f0d55863","Type":"ContainerStarted","Data":"64148e567539ee53dc679886de02850eb725d2b861ae53a9715e8804030baf57"} Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.604534 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b88fd5cc6-dqjmc" 
event={"ID":"946c8c31-01d1-45f7-87c2-a022100aeef9","Type":"ContainerStarted","Data":"de13a9e87b8b1b1396a6b56df5eb0db2cb23cd17ad0fd1fa2115022517f5d512"} Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.604579 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b88fd5cc6-dqjmc" event={"ID":"946c8c31-01d1-45f7-87c2-a022100aeef9","Type":"ContainerStarted","Data":"d8519b0f4b147248ba4d6f91a993baf664f8ea8a630bc4159f522032d1ffea67"} Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.615449 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gxq6g" podStartSLOduration=4.615426091 podStartE2EDuration="4.615426091s" podCreationTimestamp="2026-02-17 15:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:11:56.609680009 +0000 UTC m=+1183.025520505" watchObservedRunningTime="2026-02-17 15:11:56.615426091 +0000 UTC m=+1183.031266567" Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.626279 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"619aaea4-acc6-488e-bb45-6406861b48db","Type":"ContainerStarted","Data":"ead0316cb1f5db1f2740a7d328cde5d850db4180c5aa4bd15de1f4fd529db1a2"} Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.644865 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b88fd5cc6-dqjmc" podStartSLOduration=32.19037405 podStartE2EDuration="32.644847256s" podCreationTimestamp="2026-02-17 15:11:24 +0000 UTC" firstStartedPulling="2026-02-17 15:11:54.732287869 +0000 UTC m=+1181.148128345" lastFinishedPulling="2026-02-17 15:11:55.186761055 +0000 UTC m=+1181.602601551" observedRunningTime="2026-02-17 15:11:56.643922769 +0000 UTC m=+1183.059763255" watchObservedRunningTime="2026-02-17 15:11:56.644847256 +0000 UTC m=+1183.060687732" Feb 17 15:11:56 crc 
kubenswrapper[4717]: I0217 15:11:56.768306 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-54rqc"] Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.780498 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-54rqc"] Feb 17 15:11:56 crc kubenswrapper[4717]: I0217 15:11:56.892386 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c77f44b65-wwnph"] Feb 17 15:11:56 crc kubenswrapper[4717]: W0217 15:11:56.896177 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e706234_18f2_467c_8681_79402d25eb1c.slice/crio-d438099255207b27838d10495236229940290387c1284e06007791d59263513a WatchSource:0}: Error finding container d438099255207b27838d10495236229940290387c1284e06007791d59263513a: Status 404 returned error can't find the container with id d438099255207b27838d10495236229940290387c1284e06007791d59263513a Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.656028 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c77f44b65-wwnph" event={"ID":"4e706234-18f2-467c-8681-79402d25eb1c","Type":"ContainerStarted","Data":"9b0e8901f70ef1717d2e1dc0e35c08ede9faad7674cd112478e5224ef0aef980"} Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.656357 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c77f44b65-wwnph" event={"ID":"4e706234-18f2-467c-8681-79402d25eb1c","Type":"ContainerStarted","Data":"d438099255207b27838d10495236229940290387c1284e06007791d59263513a"} Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.668208 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d69c5bc8d-pt8k6" event={"ID":"a36880ff-5e50-413f-8213-79ed16bed713","Type":"ContainerStarted","Data":"950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e"} Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.670196 
4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.681504 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"619aaea4-acc6-488e-bb45-6406861b48db","Type":"ContainerStarted","Data":"031d430a6684efff0bdc84d8bef990ee658c3d163c97d882c2db0509594f01f8"} Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.682001 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="619aaea4-acc6-488e-bb45-6406861b48db" containerName="glance-log" containerID="cri-o://ead0316cb1f5db1f2740a7d328cde5d850db4180c5aa4bd15de1f4fd529db1a2" gracePeriod=30 Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.682511 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="619aaea4-acc6-488e-bb45-6406861b48db" containerName="glance-httpd" containerID="cri-o://031d430a6684efff0bdc84d8bef990ee658c3d163c97d882c2db0509594f01f8" gracePeriod=30 Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.695844 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d69c5bc8d-pt8k6" podStartSLOduration=5.695823655 podStartE2EDuration="5.695823655s" podCreationTimestamp="2026-02-17 15:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:11:57.692904602 +0000 UTC m=+1184.108745088" watchObservedRunningTime="2026-02-17 15:11:57.695823655 +0000 UTC m=+1184.111664141" Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.708789 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-blhdn" 
event={"ID":"d5afdf52-2ad7-4957-8bff-ca95fee13432","Type":"ContainerStarted","Data":"915b734f054a00c83e2030292e96a88928e5a19cd6fb858daa06eab626edeb61"} Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.717761 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.731201 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=23.731179507 podStartE2EDuration="23.731179507s" podCreationTimestamp="2026-02-17 15:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:11:57.719799794 +0000 UTC m=+1184.135640280" watchObservedRunningTime="2026-02-17 15:11:57.731179507 +0000 UTC m=+1184.147019983" Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.736194 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e20b1f7a-2627-4bde-9e82-d458ed9ba797","Type":"ContainerStarted","Data":"0d1f0ef8d7b9c63b8f044e715fb953a819d1ef8576839e32a5ec268a89c1029b"} Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.770278 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fb745b69-blhdn" podStartSLOduration=5.770247715 podStartE2EDuration="5.770247715s" podCreationTimestamp="2026-02-17 15:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:11:57.758836801 +0000 UTC m=+1184.174677287" watchObservedRunningTime="2026-02-17 15:11:57.770247715 +0000 UTC m=+1184.186088191" Feb 17 15:11:57 crc kubenswrapper[4717]: I0217 15:11:57.858525 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf41b7d-00cb-4bde-bfc5-05f1733d6b77" 
path="/var/lib/kubelet/pods/7cf41b7d-00cb-4bde-bfc5-05f1733d6b77/volumes" Feb 17 15:11:58 crc kubenswrapper[4717]: I0217 15:11:58.752944 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e20b1f7a-2627-4bde-9e82-d458ed9ba797","Type":"ContainerStarted","Data":"9fbdb66667073249b7bc449195c2e674fbfa11d81da4690ca02c5db423a1f344"} Feb 17 15:11:58 crc kubenswrapper[4717]: I0217 15:11:58.753640 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e20b1f7a-2627-4bde-9e82-d458ed9ba797" containerName="glance-log" containerID="cri-o://0d1f0ef8d7b9c63b8f044e715fb953a819d1ef8576839e32a5ec268a89c1029b" gracePeriod=30 Feb 17 15:11:58 crc kubenswrapper[4717]: I0217 15:11:58.754303 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e20b1f7a-2627-4bde-9e82-d458ed9ba797" containerName="glance-httpd" containerID="cri-o://9fbdb66667073249b7bc449195c2e674fbfa11d81da4690ca02c5db423a1f344" gracePeriod=30 Feb 17 15:11:58 crc kubenswrapper[4717]: I0217 15:11:58.765524 4717 generic.go:334] "Generic (PLEG): container finished" podID="619aaea4-acc6-488e-bb45-6406861b48db" containerID="031d430a6684efff0bdc84d8bef990ee658c3d163c97d882c2db0509594f01f8" exitCode=143 Feb 17 15:11:58 crc kubenswrapper[4717]: I0217 15:11:58.765550 4717 generic.go:334] "Generic (PLEG): container finished" podID="619aaea4-acc6-488e-bb45-6406861b48db" containerID="ead0316cb1f5db1f2740a7d328cde5d850db4180c5aa4bd15de1f4fd529db1a2" exitCode=143 Feb 17 15:11:58 crc kubenswrapper[4717]: I0217 15:11:58.766391 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"619aaea4-acc6-488e-bb45-6406861b48db","Type":"ContainerDied","Data":"031d430a6684efff0bdc84d8bef990ee658c3d163c97d882c2db0509594f01f8"} Feb 17 15:11:58 crc kubenswrapper[4717]: I0217 
15:11:58.766417 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"619aaea4-acc6-488e-bb45-6406861b48db","Type":"ContainerDied","Data":"ead0316cb1f5db1f2740a7d328cde5d850db4180c5aa4bd15de1f4fd529db1a2"} Feb 17 15:11:58 crc kubenswrapper[4717]: I0217 15:11:58.790439 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=24.79041188 podStartE2EDuration="24.79041188s" podCreationTimestamp="2026-02-17 15:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:11:58.782882446 +0000 UTC m=+1185.198722922" watchObservedRunningTime="2026-02-17 15:11:58.79041188 +0000 UTC m=+1185.206252356" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.710862 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.735218 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-logs\") pod \"619aaea4-acc6-488e-bb45-6406861b48db\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.735354 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"619aaea4-acc6-488e-bb45-6406861b48db\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.735415 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-config-data\") pod \"619aaea4-acc6-488e-bb45-6406861b48db\" (UID: 
\"619aaea4-acc6-488e-bb45-6406861b48db\") " Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.735438 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gknvc\" (UniqueName: \"kubernetes.io/projected/619aaea4-acc6-488e-bb45-6406861b48db-kube-api-access-gknvc\") pod \"619aaea4-acc6-488e-bb45-6406861b48db\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.735507 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-combined-ca-bundle\") pod \"619aaea4-acc6-488e-bb45-6406861b48db\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.735597 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-httpd-run\") pod \"619aaea4-acc6-488e-bb45-6406861b48db\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.735631 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-scripts\") pod \"619aaea4-acc6-488e-bb45-6406861b48db\" (UID: \"619aaea4-acc6-488e-bb45-6406861b48db\") " Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.735666 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-logs" (OuterVolumeSpecName: "logs") pod "619aaea4-acc6-488e-bb45-6406861b48db" (UID: "619aaea4-acc6-488e-bb45-6406861b48db"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.736516 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.736685 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "619aaea4-acc6-488e-bb45-6406861b48db" (UID: "619aaea4-acc6-488e-bb45-6406861b48db"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.743369 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "619aaea4-acc6-488e-bb45-6406861b48db" (UID: "619aaea4-acc6-488e-bb45-6406861b48db"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.750428 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619aaea4-acc6-488e-bb45-6406861b48db-kube-api-access-gknvc" (OuterVolumeSpecName: "kube-api-access-gknvc") pod "619aaea4-acc6-488e-bb45-6406861b48db" (UID: "619aaea4-acc6-488e-bb45-6406861b48db"). InnerVolumeSpecName "kube-api-access-gknvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.755095 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-scripts" (OuterVolumeSpecName: "scripts") pod "619aaea4-acc6-488e-bb45-6406861b48db" (UID: "619aaea4-acc6-488e-bb45-6406861b48db"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.815420 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"619aaea4-acc6-488e-bb45-6406861b48db","Type":"ContainerDied","Data":"502eb9b2427797dc47ca0150db7e3dd66816a58aefd4538f6e6e11bc21074650"} Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.815482 4717 scope.go:117] "RemoveContainer" containerID="031d430a6684efff0bdc84d8bef990ee658c3d163c97d882c2db0509594f01f8" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.815629 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.819027 4717 generic.go:334] "Generic (PLEG): container finished" podID="e20b1f7a-2627-4bde-9e82-d458ed9ba797" containerID="9fbdb66667073249b7bc449195c2e674fbfa11d81da4690ca02c5db423a1f344" exitCode=0 Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.819052 4717 generic.go:334] "Generic (PLEG): container finished" podID="e20b1f7a-2627-4bde-9e82-d458ed9ba797" containerID="0d1f0ef8d7b9c63b8f044e715fb953a819d1ef8576839e32a5ec268a89c1029b" exitCode=143 Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.819394 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e20b1f7a-2627-4bde-9e82-d458ed9ba797","Type":"ContainerDied","Data":"9fbdb66667073249b7bc449195c2e674fbfa11d81da4690ca02c5db423a1f344"} Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.819466 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e20b1f7a-2627-4bde-9e82-d458ed9ba797","Type":"ContainerDied","Data":"0d1f0ef8d7b9c63b8f044e715fb953a819d1ef8576839e32a5ec268a89c1029b"} Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.831967 4717 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "619aaea4-acc6-488e-bb45-6406861b48db" (UID: "619aaea4-acc6-488e-bb45-6406861b48db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.838936 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.838963 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gknvc\" (UniqueName: \"kubernetes.io/projected/619aaea4-acc6-488e-bb45-6406861b48db-kube-api-access-gknvc\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.838976 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.838987 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/619aaea4-acc6-488e-bb45-6406861b48db-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.838997 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.873339 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.874903 4717 scope.go:117] "RemoveContainer" 
containerID="ead0316cb1f5db1f2740a7d328cde5d850db4180c5aa4bd15de1f4fd529db1a2" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.885868 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-config-data" (OuterVolumeSpecName: "config-data") pod "619aaea4-acc6-488e-bb45-6406861b48db" (UID: "619aaea4-acc6-488e-bb45-6406861b48db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.944512 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:11:59 crc kubenswrapper[4717]: I0217 15:11:59.944550 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619aaea4-acc6-488e-bb45-6406861b48db-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.180165 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.196704 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.218323 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:12:00 crc kubenswrapper[4717]: E0217 15:12:00.218726 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619aaea4-acc6-488e-bb45-6406861b48db" containerName="glance-httpd" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.218739 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="619aaea4-acc6-488e-bb45-6406861b48db" containerName="glance-httpd" Feb 17 15:12:00 crc kubenswrapper[4717]: E0217 15:12:00.218769 4717 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="619aaea4-acc6-488e-bb45-6406861b48db" containerName="glance-log" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.218776 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="619aaea4-acc6-488e-bb45-6406861b48db" containerName="glance-log" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.218970 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="619aaea4-acc6-488e-bb45-6406861b48db" containerName="glance-log" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.218998 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="619aaea4-acc6-488e-bb45-6406861b48db" containerName="glance-httpd" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.231256 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.241949 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.246693 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.246908 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.353374 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.353419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.353592 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.353659 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-logs\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.353753 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.353895 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9pc\" (UniqueName: \"kubernetes.io/projected/90dd9df8-232b-4a2c-a750-4ad7209404b3-kube-api-access-vq9pc\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.353945 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.353998 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.354249 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.455455 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-httpd-run\") pod \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.455588 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-logs\") pod \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.455613 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.455644 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gfvl\" (UniqueName: 
\"kubernetes.io/projected/e20b1f7a-2627-4bde-9e82-d458ed9ba797-kube-api-access-4gfvl\") pod \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.455695 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-config-data\") pod \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.455737 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-scripts\") pod \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.455761 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-combined-ca-bundle\") pod \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\" (UID: \"e20b1f7a-2627-4bde-9e82-d458ed9ba797\") " Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.456171 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.456196 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.456240 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.456262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-logs\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.456291 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.456328 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9pc\" (UniqueName: \"kubernetes.io/projected/90dd9df8-232b-4a2c-a750-4ad7209404b3-kube-api-access-vq9pc\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.456348 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc 
kubenswrapper[4717]: I0217 15:12:00.456375 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.456679 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.458542 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-logs\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.460467 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-logs" (OuterVolumeSpecName: "logs") pod "e20b1f7a-2627-4bde-9e82-d458ed9ba797" (UID: "e20b1f7a-2627-4bde-9e82-d458ed9ba797"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.460993 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e20b1f7a-2627-4bde-9e82-d458ed9ba797" (UID: "e20b1f7a-2627-4bde-9e82-d458ed9ba797"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.461815 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.468333 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-scripts" (OuterVolumeSpecName: "scripts") pod "e20b1f7a-2627-4bde-9e82-d458ed9ba797" (UID: "e20b1f7a-2627-4bde-9e82-d458ed9ba797"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.468469 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "e20b1f7a-2627-4bde-9e82-d458ed9ba797" (UID: "e20b1f7a-2627-4bde-9e82-d458ed9ba797"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.475966 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.480980 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.481074 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.487931 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20b1f7a-2627-4bde-9e82-d458ed9ba797-kube-api-access-4gfvl" (OuterVolumeSpecName: "kube-api-access-4gfvl") pod "e20b1f7a-2627-4bde-9e82-d458ed9ba797" (UID: "e20b1f7a-2627-4bde-9e82-d458ed9ba797"). InnerVolumeSpecName "kube-api-access-4gfvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.499674 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9pc\" (UniqueName: \"kubernetes.io/projected/90dd9df8-232b-4a2c-a750-4ad7209404b3-kube-api-access-vq9pc\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.500414 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.530052 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.533351 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e20b1f7a-2627-4bde-9e82-d458ed9ba797" (UID: "e20b1f7a-2627-4bde-9e82-d458ed9ba797"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.558437 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.558499 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e20b1f7a-2627-4bde-9e82-d458ed9ba797-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.558537 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.558601 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gfvl\" (UniqueName: \"kubernetes.io/projected/e20b1f7a-2627-4bde-9e82-d458ed9ba797-kube-api-access-4gfvl\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.558617 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.558629 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.582069 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.585203 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.591195 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-config-data" (OuterVolumeSpecName: "config-data") pod "e20b1f7a-2627-4bde-9e82-d458ed9ba797" (UID: "e20b1f7a-2627-4bde-9e82-d458ed9ba797"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.663204 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.663495 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20b1f7a-2627-4bde-9e82-d458ed9ba797-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.834017 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e20b1f7a-2627-4bde-9e82-d458ed9ba797","Type":"ContainerDied","Data":"4cc357a1c47bf0d4b2978f8b262c858ec9a8694e0e4f8aab2e860013fd6ce8a9"} Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.834044 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.834104 4717 scope.go:117] "RemoveContainer" containerID="9fbdb66667073249b7bc449195c2e674fbfa11d81da4690ca02c5db423a1f344" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.838530 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c77f44b65-wwnph" event={"ID":"4e706234-18f2-467c-8681-79402d25eb1c","Type":"ContainerStarted","Data":"5b321fba023192bfa1cfb97df9238f2d1cd260a5dfa1a7792d062cbe0d32c773"} Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.850945 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"983f7c403cd3db479e3c850606e5e72987f9b662a70620c3becf49f8d098c841"} Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.892154 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.947225 4717 scope.go:117] "RemoveContainer" containerID="0d1f0ef8d7b9c63b8f044e715fb953a819d1ef8576839e32a5ec268a89c1029b" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.952136 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.989132 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:12:00 crc kubenswrapper[4717]: E0217 15:12:00.989782 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20b1f7a-2627-4bde-9e82-d458ed9ba797" containerName="glance-httpd" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.989797 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20b1f7a-2627-4bde-9e82-d458ed9ba797" containerName="glance-httpd" Feb 17 15:12:00 crc kubenswrapper[4717]: E0217 15:12:00.989811 4717 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20b1f7a-2627-4bde-9e82-d458ed9ba797" containerName="glance-log" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.989817 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20b1f7a-2627-4bde-9e82-d458ed9ba797" containerName="glance-log" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.990058 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20b1f7a-2627-4bde-9e82-d458ed9ba797" containerName="glance-httpd" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.990088 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20b1f7a-2627-4bde-9e82-d458ed9ba797" containerName="glance-log" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.991139 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.994721 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 15:12:00 crc kubenswrapper[4717]: I0217 15:12:00.995134 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.009664 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.077717 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.077787 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-logs\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.077835 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.077866 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.077926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.077956 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.078003 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cb6cj\" (UniqueName: \"kubernetes.io/projected/c7c915fe-fb2a-4325-86c0-165aa90cf60f-kube-api-access-cb6cj\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.078026 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.179210 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.179298 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb6cj\" (UniqueName: \"kubernetes.io/projected/c7c915fe-fb2a-4325-86c0-165aa90cf60f-kube-api-access-cb6cj\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.179332 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.179369 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.179417 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-logs\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.179457 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.179495 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.179573 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.182266 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-logs\") pod 
\"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.182637 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.190525 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.192245 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.192273 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.193157 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") 
" pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.199868 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.209282 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.218746 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb6cj\" (UniqueName: \"kubernetes.io/projected/c7c915fe-fb2a-4325-86c0-165aa90cf60f-kube-api-access-cb6cj\") pod \"glance-default-external-api-0\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.322707 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.413369 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.862569 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619aaea4-acc6-488e-bb45-6406861b48db" path="/var/lib/kubelet/pods/619aaea4-acc6-488e-bb45-6406861b48db/volumes" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.863536 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20b1f7a-2627-4bde-9e82-d458ed9ba797" path="/var/lib/kubelet/pods/e20b1f7a-2627-4bde-9e82-d458ed9ba797/volumes" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.870916 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:12:01 crc kubenswrapper[4717]: I0217 15:12:01.896933 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c77f44b65-wwnph" podStartSLOduration=6.89691614 podStartE2EDuration="6.89691614s" podCreationTimestamp="2026-02-17 15:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:01.890038135 +0000 UTC m=+1188.305878621" watchObservedRunningTime="2026-02-17 15:12:01.89691614 +0000 UTC m=+1188.312756616" Feb 17 15:12:02 crc kubenswrapper[4717]: I0217 15:12:02.884250 4717 generic.go:334] "Generic (PLEG): container finished" podID="7055a012-2f5d-4ba2-b56d-a9ec73e11944" containerID="967062522ba9d55030f87bbfbcaf4f9488d1ddd65c5f0052dcc040e5f828b911" exitCode=0 Feb 17 15:12:02 crc kubenswrapper[4717]: I0217 15:12:02.884439 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fg4vl" 
event={"ID":"7055a012-2f5d-4ba2-b56d-a9ec73e11944","Type":"ContainerDied","Data":"967062522ba9d55030f87bbfbcaf4f9488d1ddd65c5f0052dcc040e5f828b911"} Feb 17 15:12:02 crc kubenswrapper[4717]: I0217 15:12:02.887931 4717 generic.go:334] "Generic (PLEG): container finished" podID="de8aa40e-4ead-46b8-a94b-0a3602d030ef" containerID="649701b8f1b2cab221feb3d8ed2d837aee0b938775799a8853384119a3b8a680" exitCode=0 Feb 17 15:12:02 crc kubenswrapper[4717]: I0217 15:12:02.887979 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6ncb2" event={"ID":"de8aa40e-4ead-46b8-a94b-0a3602d030ef","Type":"ContainerDied","Data":"649701b8f1b2cab221feb3d8ed2d837aee0b938775799a8853384119a3b8a680"} Feb 17 15:12:02 crc kubenswrapper[4717]: I0217 15:12:02.897747 4717 generic.go:334] "Generic (PLEG): container finished" podID="de3759e2-5565-40bd-8b02-f3c1f0d55863" containerID="64148e567539ee53dc679886de02850eb725d2b861ae53a9715e8804030baf57" exitCode=0 Feb 17 15:12:02 crc kubenswrapper[4717]: I0217 15:12:02.898638 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxq6g" event={"ID":"de3759e2-5565-40bd-8b02-f3c1f0d55863","Type":"ContainerDied","Data":"64148e567539ee53dc679886de02850eb725d2b861ae53a9715e8804030baf57"} Feb 17 15:12:03 crc kubenswrapper[4717]: I0217 15:12:03.219595 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:12:03 crc kubenswrapper[4717]: I0217 15:12:03.316840 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-nm8xp"] Feb 17 15:12:03 crc kubenswrapper[4717]: I0217 15:12:03.317098 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" podUID="ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" containerName="dnsmasq-dns" containerID="cri-o://11b5defdab6b7391649212a44d32ee62f7660f387b887af49b09b3e843a3c95e" gracePeriod=10 Feb 17 15:12:03 crc 
kubenswrapper[4717]: I0217 15:12:03.917501 4717 generic.go:334] "Generic (PLEG): container finished" podID="ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" containerID="11b5defdab6b7391649212a44d32ee62f7660f387b887af49b09b3e843a3c95e" exitCode=0 Feb 17 15:12:03 crc kubenswrapper[4717]: I0217 15:12:03.917598 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" event={"ID":"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf","Type":"ContainerDied","Data":"11b5defdab6b7391649212a44d32ee62f7660f387b887af49b09b3e843a3c95e"} Feb 17 15:12:03 crc kubenswrapper[4717]: I0217 15:12:03.925925 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"d8f9d1c6411e62d1daee7c3197e09ca028255b79ba24108f1612e0e54576728c"} Feb 17 15:12:05 crc kubenswrapper[4717]: I0217 15:12:05.015060 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:12:05 crc kubenswrapper[4717]: I0217 15:12:05.015141 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:12:05 crc kubenswrapper[4717]: I0217 15:12:05.085889 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:12:05 crc kubenswrapper[4717]: I0217 15:12:05.085943 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:12:06 crc kubenswrapper[4717]: W0217 15:12:06.596242 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90dd9df8_232b_4a2c_a750_4ad7209404b3.slice/crio-64194f8ca0f321a2172a16a43c665f7f2dfdc1c8d447788cd550fc543cfed354 WatchSource:0}: Error finding container 64194f8ca0f321a2172a16a43c665f7f2dfdc1c8d447788cd550fc543cfed354: Status 404 returned error 
can't find the container with id 64194f8ca0f321a2172a16a43c665f7f2dfdc1c8d447788cd550fc543cfed354 Feb 17 15:12:06 crc kubenswrapper[4717]: I0217 15:12:06.910054 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fg4vl" Feb 17 15:12:06 crc kubenswrapper[4717]: I0217 15:12:06.958644 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxq6g" event={"ID":"de3759e2-5565-40bd-8b02-f3c1f0d55863","Type":"ContainerDied","Data":"44c38b1174eb9c38869dba5643b9531bc72a5d3852a2b927ec79a12eee867a9d"} Feb 17 15:12:06 crc kubenswrapper[4717]: I0217 15:12:06.958678 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c38b1174eb9c38869dba5643b9531bc72a5d3852a2b927ec79a12eee867a9d" Feb 17 15:12:06 crc kubenswrapper[4717]: I0217 15:12:06.961103 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"90dd9df8-232b-4a2c-a750-4ad7209404b3","Type":"ContainerStarted","Data":"64194f8ca0f321a2172a16a43c665f7f2dfdc1c8d447788cd550fc543cfed354"} Feb 17 15:12:06 crc kubenswrapper[4717]: I0217 15:12:06.966202 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fg4vl" event={"ID":"7055a012-2f5d-4ba2-b56d-a9ec73e11944","Type":"ContainerDied","Data":"24e453bb9c17cb29fc16601523d5ab38c4e4deb98e91ebc82bfdd93d9a64b7d4"} Feb 17 15:12:06 crc kubenswrapper[4717]: I0217 15:12:06.966236 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24e453bb9c17cb29fc16601523d5ab38c4e4deb98e91ebc82bfdd93d9a64b7d4" Feb 17 15:12:06 crc kubenswrapper[4717]: I0217 15:12:06.966326 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fg4vl" Feb 17 15:12:06 crc kubenswrapper[4717]: I0217 15:12:06.973360 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6ncb2" event={"ID":"de8aa40e-4ead-46b8-a94b-0a3602d030ef","Type":"ContainerDied","Data":"ac49e956de028d3062244d4a3ecd741f99e3bc7bda0918e1f2939ad253194131"} Feb 17 15:12:06 crc kubenswrapper[4717]: I0217 15:12:06.973404 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac49e956de028d3062244d4a3ecd741f99e3bc7bda0918e1f2939ad253194131" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.008325 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6ncb2" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.043506 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gxq6g" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.052372 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-config-data\") pod \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.052424 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-combined-ca-bundle\") pod \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.052832 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7055a012-2f5d-4ba2-b56d-a9ec73e11944-logs\") pod \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " 
Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.052867 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thgx7\" (UniqueName: \"kubernetes.io/projected/7055a012-2f5d-4ba2-b56d-a9ec73e11944-kube-api-access-thgx7\") pod \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.052905 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-scripts\") pod \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\" (UID: \"7055a012-2f5d-4ba2-b56d-a9ec73e11944\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.053737 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7055a012-2f5d-4ba2-b56d-a9ec73e11944-logs" (OuterVolumeSpecName: "logs") pod "7055a012-2f5d-4ba2-b56d-a9ec73e11944" (UID: "7055a012-2f5d-4ba2-b56d-a9ec73e11944"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.059218 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-scripts" (OuterVolumeSpecName: "scripts") pod "7055a012-2f5d-4ba2-b56d-a9ec73e11944" (UID: "7055a012-2f5d-4ba2-b56d-a9ec73e11944"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.065672 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7055a012-2f5d-4ba2-b56d-a9ec73e11944-kube-api-access-thgx7" (OuterVolumeSpecName: "kube-api-access-thgx7") pod "7055a012-2f5d-4ba2-b56d-a9ec73e11944" (UID: "7055a012-2f5d-4ba2-b56d-a9ec73e11944"). InnerVolumeSpecName "kube-api-access-thgx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.070073 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.106963 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7055a012-2f5d-4ba2-b56d-a9ec73e11944" (UID: "7055a012-2f5d-4ba2-b56d-a9ec73e11944"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.107503 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-config-data" (OuterVolumeSpecName: "config-data") pod "7055a012-2f5d-4ba2-b56d-a9ec73e11944" (UID: "7055a012-2f5d-4ba2-b56d-a9ec73e11944"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.155565 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmc6f\" (UniqueName: \"kubernetes.io/projected/de8aa40e-4ead-46b8-a94b-0a3602d030ef-kube-api-access-qmc6f\") pod \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.155625 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-combined-ca-bundle\") pod \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.155678 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-config-data\") pod \"de3759e2-5565-40bd-8b02-f3c1f0d55863\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.155721 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzfdw\" (UniqueName: \"kubernetes.io/projected/de3759e2-5565-40bd-8b02-f3c1f0d55863-kube-api-access-gzfdw\") pod \"de3759e2-5565-40bd-8b02-f3c1f0d55863\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.155744 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-credential-keys\") pod \"de3759e2-5565-40bd-8b02-f3c1f0d55863\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.155835 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-fernet-keys\") pod \"de3759e2-5565-40bd-8b02-f3c1f0d55863\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.155951 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-combined-ca-bundle\") pod \"de3759e2-5565-40bd-8b02-f3c1f0d55863\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.155968 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-scripts\") pod \"de3759e2-5565-40bd-8b02-f3c1f0d55863\" (UID: \"de3759e2-5565-40bd-8b02-f3c1f0d55863\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.156022 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-db-sync-config-data\") pod \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\" (UID: \"de8aa40e-4ead-46b8-a94b-0a3602d030ef\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.156344 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7055a012-2f5d-4ba2-b56d-a9ec73e11944-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.156359 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thgx7\" (UniqueName: \"kubernetes.io/projected/7055a012-2f5d-4ba2-b56d-a9ec73e11944-kube-api-access-thgx7\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.156418 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.156427 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.156435 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7055a012-2f5d-4ba2-b56d-a9ec73e11944-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.162511 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8aa40e-4ead-46b8-a94b-0a3602d030ef-kube-api-access-qmc6f" (OuterVolumeSpecName: "kube-api-access-qmc6f") pod "de8aa40e-4ead-46b8-a94b-0a3602d030ef" (UID: "de8aa40e-4ead-46b8-a94b-0a3602d030ef"). InnerVolumeSpecName "kube-api-access-qmc6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.162627 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "de3759e2-5565-40bd-8b02-f3c1f0d55863" (UID: "de3759e2-5565-40bd-8b02-f3c1f0d55863"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.165388 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "de3759e2-5565-40bd-8b02-f3c1f0d55863" (UID: "de3759e2-5565-40bd-8b02-f3c1f0d55863"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.167192 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-scripts" (OuterVolumeSpecName: "scripts") pod "de3759e2-5565-40bd-8b02-f3c1f0d55863" (UID: "de3759e2-5565-40bd-8b02-f3c1f0d55863"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.171183 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "de8aa40e-4ead-46b8-a94b-0a3602d030ef" (UID: "de8aa40e-4ead-46b8-a94b-0a3602d030ef"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.175371 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de3759e2-5565-40bd-8b02-f3c1f0d55863-kube-api-access-gzfdw" (OuterVolumeSpecName: "kube-api-access-gzfdw") pod "de3759e2-5565-40bd-8b02-f3c1f0d55863" (UID: "de3759e2-5565-40bd-8b02-f3c1f0d55863"). InnerVolumeSpecName "kube-api-access-gzfdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.188096 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de8aa40e-4ead-46b8-a94b-0a3602d030ef" (UID: "de8aa40e-4ead-46b8-a94b-0a3602d030ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.202224 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de3759e2-5565-40bd-8b02-f3c1f0d55863" (UID: "de3759e2-5565-40bd-8b02-f3c1f0d55863"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.210056 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-config-data" (OuterVolumeSpecName: "config-data") pod "de3759e2-5565-40bd-8b02-f3c1f0d55863" (UID: "de3759e2-5565-40bd-8b02-f3c1f0d55863"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.257800 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-dns-svc\") pod \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258227 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-nb\") pod \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258296 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-config\") pod \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " Feb 17 15:12:07 crc 
kubenswrapper[4717]: I0217 15:12:07.258403 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-sb\") pod \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258429 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggl4k\" (UniqueName: \"kubernetes.io/projected/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-kube-api-access-ggl4k\") pod \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\" (UID: \"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf\") " Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258837 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258857 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmc6f\" (UniqueName: \"kubernetes.io/projected/de8aa40e-4ead-46b8-a94b-0a3602d030ef-kube-api-access-qmc6f\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258870 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8aa40e-4ead-46b8-a94b-0a3602d030ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258882 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258893 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzfdw\" (UniqueName: 
\"kubernetes.io/projected/de3759e2-5565-40bd-8b02-f3c1f0d55863-kube-api-access-gzfdw\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258901 4717 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258910 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258919 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.258927 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de3759e2-5565-40bd-8b02-f3c1f0d55863-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.263995 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-kube-api-access-ggl4k" (OuterVolumeSpecName: "kube-api-access-ggl4k") pod "ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" (UID: "ef7c0b4b-a1e5-4449-be56-928ccc11dbaf"). InnerVolumeSpecName "kube-api-access-ggl4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.321057 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.360370 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggl4k\" (UniqueName: \"kubernetes.io/projected/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-kube-api-access-ggl4k\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.401031 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" (UID: "ef7c0b4b-a1e5-4449-be56-928ccc11dbaf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.401270 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" (UID: "ef7c0b4b-a1e5-4449-be56-928ccc11dbaf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.411473 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" (UID: "ef7c0b4b-a1e5-4449-be56-928ccc11dbaf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.412289 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-config" (OuterVolumeSpecName: "config") pod "ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" (UID: "ef7c0b4b-a1e5-4449-be56-928ccc11dbaf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.462200 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.462230 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.462239 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:07 crc kubenswrapper[4717]: I0217 15:12:07.462247 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.011121 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"90dd9df8-232b-4a2c-a750-4ad7209404b3","Type":"ContainerStarted","Data":"e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc"} Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.026985 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-567bd59758-rrcxt"] Feb 17 
15:12:08 crc kubenswrapper[4717]: E0217 15:12:08.027416 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8aa40e-4ead-46b8-a94b-0a3602d030ef" containerName="barbican-db-sync" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.027429 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8aa40e-4ead-46b8-a94b-0a3602d030ef" containerName="barbican-db-sync" Feb 17 15:12:08 crc kubenswrapper[4717]: E0217 15:12:08.027437 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" containerName="dnsmasq-dns" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.027443 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" containerName="dnsmasq-dns" Feb 17 15:12:08 crc kubenswrapper[4717]: E0217 15:12:08.027459 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" containerName="init" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.027467 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" containerName="init" Feb 17 15:12:08 crc kubenswrapper[4717]: E0217 15:12:08.027489 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3759e2-5565-40bd-8b02-f3c1f0d55863" containerName="keystone-bootstrap" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.027496 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3759e2-5565-40bd-8b02-f3c1f0d55863" containerName="keystone-bootstrap" Feb 17 15:12:08 crc kubenswrapper[4717]: E0217 15:12:08.027505 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7055a012-2f5d-4ba2-b56d-a9ec73e11944" containerName="placement-db-sync" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.027510 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7055a012-2f5d-4ba2-b56d-a9ec73e11944" containerName="placement-db-sync" Feb 17 15:12:08 crc 
kubenswrapper[4717]: I0217 15:12:08.027669 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8aa40e-4ead-46b8-a94b-0a3602d030ef" containerName="barbican-db-sync" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.027680 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7055a012-2f5d-4ba2-b56d-a9ec73e11944" containerName="placement-db-sync" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.027693 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" containerName="dnsmasq-dns" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.027701 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3759e2-5565-40bd-8b02-f3c1f0d55863" containerName="keystone-bootstrap" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.028569 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-567bd59758-rrcxt" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.037309 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" event={"ID":"ef7c0b4b-a1e5-4449-be56-928ccc11dbaf","Type":"ContainerDied","Data":"f57eba55ebd2a933abdd06cf55eb504eaa77e5d70d50f6f2fcf4a1abab56249d"} Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.037364 4717 scope.go:117] "RemoveContainer" containerID="11b5defdab6b7391649212a44d32ee62f7660f387b887af49b09b3e843a3c95e" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.037479 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.041735 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-v2976" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.062555 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-567bd59758-rrcxt"] Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.062940 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.063183 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.063396 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.069376 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1","Type":"ContainerStarted","Data":"04306c04a60664c6f4bcf509c853df0bfc861b7f61f0d79be121eacc4762382e"} Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.069442 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.094424 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"fcfdcd62ff48857ee9331bd9ab361564a05884d49ee4bf1763d7a8011db5934d"} Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.094474 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"57e3ea4a678736237086878add23824a87fa180ec86e76aa605caf029518c1ae"} Feb 17 
15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.103958 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gxq6g"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.104875 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c915fe-fb2a-4325-86c0-165aa90cf60f","Type":"ContainerStarted","Data":"623a425c8d49ca98e0cc6a63ae67a8dda00da468286a3c09906f11f56cdf7540"}
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.104989 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6ncb2"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.134272 4717 scope.go:117] "RemoveContainer" containerID="af2920544484512942f30b3deaa6859c6b4094ddbfdc0fa8712cd66cca946fce"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.149475 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-nm8xp"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.157220 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-nm8xp"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.186042 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-internal-tls-certs\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.186109 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-public-tls-certs\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.186171 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/932dfe25-b5f6-4cc6-92f7-6594c8449263-logs\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.186215 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-scripts\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.186245 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99kv4\" (UniqueName: \"kubernetes.io/projected/932dfe25-b5f6-4cc6-92f7-6594c8449263-kube-api-access-99kv4\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.186263 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-config-data\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.186286 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-combined-ca-bundle\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.281018 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-96f9cc575-vd9jv"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.284903 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.287251 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-internal-tls-certs\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.287305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-public-tls-certs\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.287384 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/932dfe25-b5f6-4cc6-92f7-6594c8449263-logs\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.287426 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-scripts\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.287460 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99kv4\" (UniqueName: \"kubernetes.io/projected/932dfe25-b5f6-4cc6-92f7-6594c8449263-kube-api-access-99kv4\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.287477 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-config-data\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.287497 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-combined-ca-bundle\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.287966 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/932dfe25-b5f6-4cc6-92f7-6594c8449263-logs\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.296588 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-scripts\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.296800 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-combined-ca-bundle\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.298850 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-internal-tls-certs\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.302487 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.302731 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.319426 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.321022 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sdkz9"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.323416 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.324895 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-config-data\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.326385 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.352907 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99kv4\" (UniqueName: \"kubernetes.io/projected/932dfe25-b5f6-4cc6-92f7-6594c8449263-kube-api-access-99kv4\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.370224 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-public-tls-certs\") pod \"placement-567bd59758-rrcxt\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.393289 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-96f9cc575-vd9jv"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.398572 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-credential-keys\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.398622 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbpd2\" (UniqueName: \"kubernetes.io/projected/c8ee178e-3086-48e3-ad91-aefa00e0d10e-kube-api-access-zbpd2\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.398643 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-fernet-keys\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.398687 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-config-data\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.398713 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-combined-ca-bundle\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.398743 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-internal-tls-certs\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.398783 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-scripts\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.398831 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-public-tls-certs\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.510210 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-scripts\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.510276 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-public-tls-certs\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.510317 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-credential-keys\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.510337 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbpd2\" (UniqueName: \"kubernetes.io/projected/c8ee178e-3086-48e3-ad91-aefa00e0d10e-kube-api-access-zbpd2\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.510355 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-fernet-keys\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.510393 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-config-data\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.510417 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-combined-ca-bundle\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.510447 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-internal-tls-certs\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.515623 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-internal-tls-certs\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.518548 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-scripts\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.525062 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-config-data\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.526433 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-combined-ca-bundle\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.531469 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-credential-keys\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.538698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-public-tls-certs\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.539268 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8ee178e-3086-48e3-ad91-aefa00e0d10e-fernet-keys\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.550875 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-b587759fd-cw959"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.552240 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.604959 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.604978 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.614135 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vpgkk"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.622662 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4300fb-9575-47b9-89a3-ca587932319f-logs\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.622766 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data-custom\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.622828 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-combined-ca-bundle\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.622876 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffq8s\" (UniqueName: \"kubernetes.io/projected/bb4300fb-9575-47b9-89a3-ca587932319f-kube-api-access-ffq8s\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.622920 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.643129 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbpd2\" (UniqueName: \"kubernetes.io/projected/c8ee178e-3086-48e3-ad91-aefa00e0d10e-kube-api-access-zbpd2\") pod \"keystone-96f9cc575-vd9jv\" (UID: \"c8ee178e-3086-48e3-ad91-aefa00e0d10e\") " pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.649283 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.685935 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b587759fd-cw959"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.716183 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6b7b6cb599-wtwlr"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.717906 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.727104 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.727146 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4300fb-9575-47b9-89a3-ca587932319f-logs\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.727252 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data-custom\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.727308 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-combined-ca-bundle\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.727346 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffq8s\" (UniqueName: \"kubernetes.io/projected/bb4300fb-9575-47b9-89a3-ca587932319f-kube-api-access-ffq8s\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.729367 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.730018 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4300fb-9575-47b9-89a3-ca587932319f-logs\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.750168 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b7b6cb599-wtwlr"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.758300 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.768954 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data-custom\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.772402 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.775825 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffq8s\" (UniqueName: \"kubernetes.io/projected/bb4300fb-9575-47b9-89a3-ca587932319f-kube-api-access-ffq8s\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.801142 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7ff59556db-hl8w5"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.802382 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-combined-ca-bundle\") pod \"barbican-keystone-listener-b587759fd-cw959\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.802764 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.805056 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.829806 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data-custom\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.829847 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlhkr\" (UniqueName: \"kubernetes.io/projected/914980ae-b392-46a5-bf14-0a80569736f1-kube-api-access-rlhkr\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.829889 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data-custom\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.830039 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/914980ae-b392-46a5-bf14-0a80569736f1-logs\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.830107 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.830421 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-combined-ca-bundle\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.830465 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj287\" (UniqueName: \"kubernetes.io/projected/f0360385-5ad6-4d9b-b825-1cae0c23d814-kube-api-access-xj287\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.830505 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0360385-5ad6-4d9b-b825-1cae0c23d814-logs\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.830533 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.830610 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-combined-ca-bundle\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.838291 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-fw7hr"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.840122 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.870137 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-fw7hr"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.882727 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-b587759fd-cw959"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.896525 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ff59556db-hl8w5"]
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.934624 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-combined-ca-bundle\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.934671 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj287\" (UniqueName: \"kubernetes.io/projected/f0360385-5ad6-4d9b-b825-1cae0c23d814-kube-api-access-xj287\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.934691 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0360385-5ad6-4d9b-b825-1cae0c23d814-logs\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.934713 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.934743 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-combined-ca-bundle\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.934788 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data-custom\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.934810 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlhkr\" (UniqueName: \"kubernetes.io/projected/914980ae-b392-46a5-bf14-0a80569736f1-kube-api-access-rlhkr\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.934839 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data-custom\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.934884 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/914980ae-b392-46a5-bf14-0a80569736f1-logs\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.934911 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.950520 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.958378 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/914980ae-b392-46a5-bf14-0a80569736f1-logs\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr"
Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 
15:12:08.958611 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0360385-5ad6-4d9b-b825-1cae0c23d814-logs\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.962535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-combined-ca-bundle\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.967993 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data-custom\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.972865 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data-custom\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.981019 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-combined-ca-bundle\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.986556 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rlhkr\" (UniqueName: \"kubernetes.io/projected/914980ae-b392-46a5-bf14-0a80569736f1-kube-api-access-rlhkr\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr" Feb 17 15:12:08 crc kubenswrapper[4717]: I0217 15:12:08.987559 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data\") pod \"barbican-worker-6b7b6cb599-wtwlr\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " pod="openstack/barbican-worker-6b7b6cb599-wtwlr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.017716 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj287\" (UniqueName: \"kubernetes.io/projected/f0360385-5ad6-4d9b-b825-1cae0c23d814-kube-api-access-xj287\") pod \"barbican-api-7ff59556db-hl8w5\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " pod="openstack/barbican-api-7ff59556db-hl8w5" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.026099 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-96ddcd589-gq7qg"] Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.027821 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.038328 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-678d9d64d6-sj9md"] Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.040274 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.041377 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-config\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.041485 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.041505 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.041723 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.041752 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99fjn\" (UniqueName: \"kubernetes.io/projected/46311c1e-077b-4089-8a6e-c8497cd5b796-kube-api-access-99fjn\") pod 
\"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.130970 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-96ddcd589-gq7qg"] Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161290 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3b8192-afd0-44c1-886f-7dc072112460-config-data\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161345 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3b8192-afd0-44c1-886f-7dc072112460-combined-ca-bundle\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161368 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dbbn\" (UniqueName: \"kubernetes.io/projected/492fb01b-1889-47ff-b334-750b9614cc60-kube-api-access-6dbbn\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161395 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc 
kubenswrapper[4717]: I0217 15:12:09.161422 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99fjn\" (UniqueName: \"kubernetes.io/projected/46311c1e-077b-4089-8a6e-c8497cd5b796-kube-api-access-99fjn\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161470 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b3b8192-afd0-44c1-886f-7dc072112460-config-data-custom\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161490 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492fb01b-1889-47ff-b334-750b9614cc60-combined-ca-bundle\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161548 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492fb01b-1889-47ff-b334-750b9614cc60-config-data\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161572 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-config\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") 
" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161600 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b3b8192-afd0-44c1-886f-7dc072112460-logs\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161626 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161647 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161665 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phmc9\" (UniqueName: \"kubernetes.io/projected/0b3b8192-afd0-44c1-886f-7dc072112460-kube-api-access-phmc9\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161700 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/492fb01b-1889-47ff-b334-750b9614cc60-logs\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " 
pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.161747 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/492fb01b-1889-47ff-b334-750b9614cc60-config-data-custom\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.163806 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-config\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.163869 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.164680 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.165030 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-678d9d64d6-sj9md"] Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.165802 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.173518 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c915fe-fb2a-4325-86c0-165aa90cf60f","Type":"ContainerStarted","Data":"bbf2b95657c669613cc3cb5c6dfdbdd07008aa6a16efec98184347f95be3369c"} Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.190655 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7977cbd67d-28z6x"] Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.192425 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"90dd9df8-232b-4a2c-a750-4ad7209404b3","Type":"ContainerStarted","Data":"842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789"} Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.192597 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.193458 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99fjn\" (UniqueName: \"kubernetes.io/projected/46311c1e-077b-4089-8a6e-c8497cd5b796-kube-api-access-99fjn\") pod \"dnsmasq-dns-7d649d8c65-fw7hr\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") " pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.207646 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7977cbd67d-28z6x"] Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.227583 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.249715 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.249693676 podStartE2EDuration="9.249693676s" podCreationTimestamp="2026-02-17 15:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:09.217141743 +0000 UTC m=+1195.632982239" watchObservedRunningTime="2026-02-17 15:12:09.249693676 +0000 UTC m=+1195.665534142" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.263770 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmc9\" (UniqueName: \"kubernetes.io/projected/0b3b8192-afd0-44c1-886f-7dc072112460-kube-api-access-phmc9\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.263862 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/492fb01b-1889-47ff-b334-750b9614cc60-logs\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.263909 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/492fb01b-1889-47ff-b334-750b9614cc60-config-data-custom\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.263960 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3b8192-afd0-44c1-886f-7dc072112460-config-data\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.263997 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3b8192-afd0-44c1-886f-7dc072112460-combined-ca-bundle\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.264018 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dbbn\" (UniqueName: \"kubernetes.io/projected/492fb01b-1889-47ff-b334-750b9614cc60-kube-api-access-6dbbn\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.264145 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b3b8192-afd0-44c1-886f-7dc072112460-config-data-custom\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.264165 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492fb01b-1889-47ff-b334-750b9614cc60-combined-ca-bundle\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 
15:12:09.264225 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492fb01b-1889-47ff-b334-750b9614cc60-config-data\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.264298 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b3b8192-afd0-44c1-886f-7dc072112460-logs\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.264826 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b3b8192-afd0-44c1-886f-7dc072112460-logs\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.275992 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3b8192-afd0-44c1-886f-7dc072112460-config-data\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.278698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/492fb01b-1889-47ff-b334-750b9614cc60-logs\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.281581 4717 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b3b8192-afd0-44c1-886f-7dc072112460-config-data-custom\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.282019 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7ff59556db-hl8w5" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.282608 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/492fb01b-1889-47ff-b334-750b9614cc60-config-data-custom\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.284520 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492fb01b-1889-47ff-b334-750b9614cc60-combined-ca-bundle\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.292230 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492fb01b-1889-47ff-b334-750b9614cc60-config-data\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.292770 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3b8192-afd0-44c1-886f-7dc072112460-combined-ca-bundle\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: 
\"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.294670 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmc9\" (UniqueName: \"kubernetes.io/projected/0b3b8192-afd0-44c1-886f-7dc072112460-kube-api-access-phmc9\") pod \"barbican-worker-96ddcd589-gq7qg\" (UID: \"0b3b8192-afd0-44c1-886f-7dc072112460\") " pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.310099 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dbbn\" (UniqueName: \"kubernetes.io/projected/492fb01b-1889-47ff-b334-750b9614cc60-kube-api-access-6dbbn\") pod \"barbican-keystone-listener-678d9d64d6-sj9md\" (UID: \"492fb01b-1889-47ff-b334-750b9614cc60\") " pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.327998 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.367031 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-combined-ca-bundle\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.367383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data-custom\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.367437 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sjrn\" (UniqueName: \"kubernetes.io/projected/db14c6ee-db55-4d0e-ace5-888795028d98-kube-api-access-5sjrn\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.367571 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db14c6ee-db55-4d0e-ace5-888795028d98-logs\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.367606 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data\") pod 
\"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.411593 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-96ddcd589-gq7qg" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.416835 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-567bd59758-rrcxt"] Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.426063 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.473040 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-combined-ca-bundle\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.475831 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data-custom\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.475878 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sjrn\" (UniqueName: \"kubernetes.io/projected/db14c6ee-db55-4d0e-ace5-888795028d98-kube-api-access-5sjrn\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.476018 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db14c6ee-db55-4d0e-ace5-888795028d98-logs\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.476044 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.478000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db14c6ee-db55-4d0e-ace5-888795028d98-logs\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.481510 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data-custom\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.483763 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.484600 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-combined-ca-bundle\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.506269 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sjrn\" (UniqueName: \"kubernetes.io/projected/db14c6ee-db55-4d0e-ace5-888795028d98-kube-api-access-5sjrn\") pod \"barbican-api-7977cbd67d-28z6x\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.538187 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.713887 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b587759fd-cw959"] Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.802847 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-96f9cc575-vd9jv"] Feb 17 15:12:09 crc kubenswrapper[4717]: I0217 15:12:09.908553 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" path="/var/lib/kubelet/pods/ef7c0b4b-a1e5-4449-be56-928ccc11dbaf/volumes" Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.229598 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-96f9cc575-vd9jv" event={"ID":"c8ee178e-3086-48e3-ad91-aefa00e0d10e","Type":"ContainerStarted","Data":"9e59b3258862cdeb04f888d63f8c0703bb0890654727a8e090c9956dfc7d7c9f"} Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.237653 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b587759fd-cw959" 
event={"ID":"bb4300fb-9575-47b9-89a3-ca587932319f","Type":"ContainerStarted","Data":"28d9d8c2590431d72526cd361b4482ee3915e255478dd31295da28098ddf1b39"} Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.242366 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-567bd59758-rrcxt" event={"ID":"932dfe25-b5f6-4cc6-92f7-6594c8449263","Type":"ContainerStarted","Data":"6df84643c8d7bebb2c7877075b5b1812acd717d08ec60c1df1651f1d8d0ee248"} Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.256631 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ff59556db-hl8w5"] Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.281739 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b7b6cb599-wtwlr"] Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.334590 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-96ddcd589-gq7qg"] Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.357071 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-fw7hr"] Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.483539 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-678d9d64d6-sj9md"] Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.493631 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7977cbd67d-28z6x"] Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.584056 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.584335 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.636990 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Feb 17 15:12:10 crc kubenswrapper[4717]: I0217 15:12:10.647295 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 15:12:10 crc kubenswrapper[4717]: W0217 15:12:10.912111 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46311c1e_077b_4089_8a6e_c8497cd5b796.slice/crio-e3849f4fbbee51e34126e46baadd483453d9fe79d64beeaccdb2f94b9fd351fe WatchSource:0}: Error finding container e3849f4fbbee51e34126e46baadd483453d9fe79d64beeaccdb2f94b9fd351fe: Status 404 returned error can't find the container with id e3849f4fbbee51e34126e46baadd483453d9fe79d64beeaccdb2f94b9fd351fe Feb 17 15:12:10 crc kubenswrapper[4717]: W0217 15:12:10.918468 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0360385_5ad6_4d9b_b825_1cae0c23d814.slice/crio-fb05ee5e24f80c254a9d3ad89c1d5d31327d0b1c77487b2f8d04f1fa4d88b04a WatchSource:0}: Error finding container fb05ee5e24f80c254a9d3ad89c1d5d31327d0b1c77487b2f8d04f1fa4d88b04a: Status 404 returned error can't find the container with id fb05ee5e24f80c254a9d3ad89c1d5d31327d0b1c77487b2f8d04f1fa4d88b04a Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.267113 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" event={"ID":"46311c1e-077b-4089-8a6e-c8497cd5b796","Type":"ContainerStarted","Data":"e3849f4fbbee51e34126e46baadd483453d9fe79d64beeaccdb2f94b9fd351fe"} Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.272200 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-96ddcd589-gq7qg" event={"ID":"0b3b8192-afd0-44c1-886f-7dc072112460","Type":"ContainerStarted","Data":"16d202f71a56660b58834635e7199def35bca66c5beb4527884e1c6aaa8e918f"} Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 
15:12:11.283481 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" event={"ID":"914980ae-b392-46a5-bf14-0a80569736f1","Type":"ContainerStarted","Data":"b1813be82f008b47a6087379b0e8d011a5bf7fe4b5c837887018fca5b0b37a37"} Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.292256 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff59556db-hl8w5" event={"ID":"f0360385-5ad6-4d9b-b825-1cae0c23d814","Type":"ContainerStarted","Data":"fb05ee5e24f80c254a9d3ad89c1d5d31327d0b1c77487b2f8d04f1fa4d88b04a"} Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.300353 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7977cbd67d-28z6x" event={"ID":"db14c6ee-db55-4d0e-ace5-888795028d98","Type":"ContainerStarted","Data":"ccf86e9aefee981da8b75b01c365cca800c9ed59e713b7b3acbf2db3b3585c7b"} Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.314043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s5lzg" event={"ID":"b9c9596c-6124-44d0-b06b-a99477938b79","Type":"ContainerStarted","Data":"ebb33749d4e4dad6fa211bddb69de895fd6053aac3712d7fc8155a6bf83ada93"} Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.321991 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c915fe-fb2a-4325-86c0-165aa90cf60f","Type":"ContainerStarted","Data":"b07b381180bc0ff47bb23b19d1309dceb471c7bbc5bca8ad21e62c5b085b0021"} Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.322892 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.323032 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.323984 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" event={"ID":"492fb01b-1889-47ff-b334-750b9614cc60","Type":"ContainerStarted","Data":"8e7b7a12f8642115be8d3f34bc4bbe5f55a60e52de1717cb6937e8322c7940ee"} Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.326634 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-567bd59758-rrcxt" event={"ID":"932dfe25-b5f6-4cc6-92f7-6594c8449263","Type":"ContainerStarted","Data":"09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1"} Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.326914 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.326943 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.358491 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-s5lzg" podStartSLOduration=3.478130679 podStartE2EDuration="55.358474247s" podCreationTimestamp="2026-02-17 15:11:16 +0000 UTC" firstStartedPulling="2026-02-17 15:11:17.682715343 +0000 UTC m=+1144.098555819" lastFinishedPulling="2026-02-17 15:12:09.563058911 +0000 UTC m=+1195.978899387" observedRunningTime="2026-02-17 15:12:11.337261195 +0000 UTC m=+1197.753101672" watchObservedRunningTime="2026-02-17 15:12:11.358474247 +0000 UTC m=+1197.774314713" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.366933 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.366912786 podStartE2EDuration="11.366912786s" podCreationTimestamp="2026-02-17 15:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:11.365905458 +0000 UTC 
m=+1197.781745954" watchObservedRunningTime="2026-02-17 15:12:11.366912786 +0000 UTC m=+1197.782753262" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.394072 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.433939 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.731049 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7977cbd67d-28z6x"] Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.785785 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-dc9f5cc6-vmvkc"] Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.791251 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.797546 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.797932 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.801047 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-dc9f5cc6-vmvkc"] Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.906118 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24915ee8-ac0d-4c45-826f-6ca18351c1fd-logs\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.906253 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-config-data\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.906289 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-config-data-custom\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.906576 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-combined-ca-bundle\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.906737 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-public-tls-certs\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.906931 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44c2m\" (UniqueName: \"kubernetes.io/projected/24915ee8-ac0d-4c45-826f-6ca18351c1fd-kube-api-access-44c2m\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 
15:12:11.911952 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-internal-tls-certs\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.964827 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7fb4dbdc46-npdpn"] Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.966352 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:11 crc kubenswrapper[4717]: I0217 15:12:11.992916 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-68dcc9cf6f-nm8xp" podUID="ef7c0b4b-a1e5-4449-be56-928ccc11dbaf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: i/o timeout" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.015864 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24915ee8-ac0d-4c45-826f-6ca18351c1fd-logs\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.015920 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-scripts\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.015940 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-config-data\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016181 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-config-data\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016198 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-config-data-custom\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016220 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-combined-ca-bundle\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016239 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5xqw\" (UniqueName: \"kubernetes.io/projected/c1670724-02ac-4de2-b161-c2b78ecb9bb0-kube-api-access-z5xqw\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016269 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c1670724-02ac-4de2-b161-c2b78ecb9bb0-logs\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016287 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-internal-tls-certs\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016311 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-public-tls-certs\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016327 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-public-tls-certs\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016392 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44c2m\" (UniqueName: \"kubernetes.io/projected/24915ee8-ac0d-4c45-826f-6ca18351c1fd-kube-api-access-44c2m\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016411 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-internal-tls-certs\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016444 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-combined-ca-bundle\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.016845 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24915ee8-ac0d-4c45-826f-6ca18351c1fd-logs\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.020245 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7fb4dbdc46-npdpn"] Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.025642 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-config-data-custom\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.032792 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-combined-ca-bundle\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.051076 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-public-tls-certs\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.052165 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-config-data\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.060626 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24915ee8-ac0d-4c45-826f-6ca18351c1fd-internal-tls-certs\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.065733 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44c2m\" (UniqueName: \"kubernetes.io/projected/24915ee8-ac0d-4c45-826f-6ca18351c1fd-kube-api-access-44c2m\") pod \"barbican-api-dc9f5cc6-vmvkc\" (UID: \"24915ee8-ac0d-4c45-826f-6ca18351c1fd\") " pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.118352 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-scripts\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.118403 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-config-data\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.118443 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5xqw\" (UniqueName: \"kubernetes.io/projected/c1670724-02ac-4de2-b161-c2b78ecb9bb0-kube-api-access-z5xqw\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.118469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-internal-tls-certs\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.118486 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1670724-02ac-4de2-b161-c2b78ecb9bb0-logs\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.118511 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-public-tls-certs\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.119356 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-combined-ca-bundle\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.121345 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1670724-02ac-4de2-b161-c2b78ecb9bb0-logs\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.132687 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-config-data\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.132989 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-scripts\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.136882 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-combined-ca-bundle\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.138737 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-public-tls-certs\") pod \"placement-7fb4dbdc46-npdpn\" (UID: 
\"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.151187 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5xqw\" (UniqueName: \"kubernetes.io/projected/c1670724-02ac-4de2-b161-c2b78ecb9bb0-kube-api-access-z5xqw\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.156709 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1670724-02ac-4de2-b161-c2b78ecb9bb0-internal-tls-certs\") pod \"placement-7fb4dbdc46-npdpn\" (UID: \"c1670724-02ac-4de2-b161-c2b78ecb9bb0\") " pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.175186 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.219913 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.350577 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-567bd59758-rrcxt" event={"ID":"932dfe25-b5f6-4cc6-92f7-6594c8449263","Type":"ContainerStarted","Data":"2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0"} Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.351764 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-567bd59758-rrcxt" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.351791 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-567bd59758-rrcxt" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.366154 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7977cbd67d-28z6x" event={"ID":"db14c6ee-db55-4d0e-ace5-888795028d98","Type":"ContainerStarted","Data":"02e725f49c8e35a53dda500de67ee1aa36dae22b1627cb7aa9a638c093d9c883"} Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.366205 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7977cbd67d-28z6x" event={"ID":"db14c6ee-db55-4d0e-ace5-888795028d98","Type":"ContainerStarted","Data":"3ddd5f1a66da608a3be6c50bc83e584513f22476436c9d9fdc18a7c8c5b4e209"} Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.378118 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-567bd59758-rrcxt" podStartSLOduration=4.378102077 podStartE2EDuration="4.378102077s" podCreationTimestamp="2026-02-17 15:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:12.375182464 +0000 UTC m=+1198.791022950" watchObservedRunningTime="2026-02-17 15:12:12.378102077 +0000 UTC m=+1198.793942563" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.389388 4717 
generic.go:334] "Generic (PLEG): container finished" podID="46311c1e-077b-4089-8a6e-c8497cd5b796" containerID="2099de09ac93e60fa3f259ec16f0d3700dad6773496ada71ff46ad3d902b4e30" exitCode=0 Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.390517 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" event={"ID":"46311c1e-077b-4089-8a6e-c8497cd5b796","Type":"ContainerDied","Data":"2099de09ac93e60fa3f259ec16f0d3700dad6773496ada71ff46ad3d902b4e30"} Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.417282 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-96f9cc575-vd9jv" event={"ID":"c8ee178e-3086-48e3-ad91-aefa00e0d10e","Type":"ContainerStarted","Data":"58bbcbba573a9b690b4f8468a826309a303dad97493e22e0b8a1dd016d862300"} Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.418387 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-96f9cc575-vd9jv" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.460548 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-96f9cc575-vd9jv" podStartSLOduration=4.460534214 podStartE2EDuration="4.460534214s" podCreationTimestamp="2026-02-17 15:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:12.458156917 +0000 UTC m=+1198.873997403" watchObservedRunningTime="2026-02-17 15:12:12.460534214 +0000 UTC m=+1198.876374690" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.493682 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"528f2e450e72f62b3ad2eecc44f2dcf4715d7d675e6e5cd475f77cbdc51c59f6"} Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.493719 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"11ef50da0f4183670d7a0eb76be1a26a743eac67f97e7a7d45761efe2a5ba59c"} Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.493729 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"3a287f63faa7560c44c23a9eaf3c5b8cd59e3b12f5159790f052f212cfa2546d"} Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.505738 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff59556db-hl8w5" event={"ID":"f0360385-5ad6-4d9b-b825-1cae0c23d814","Type":"ContainerStarted","Data":"c4724f7dcdc72e5dac94a357887527f8daee7f38df85c48e6a901084055c22f9"} Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.506107 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff59556db-hl8w5" event={"ID":"f0360385-5ad6-4d9b-b825-1cae0c23d814","Type":"ContainerStarted","Data":"3838e90e0e40e4cbb19960a2004379528c4e2eaff1928a49b1253b50209ed9d3"} Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.506292 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.506382 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.573800 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7ff59556db-hl8w5" podStartSLOduration=4.573780605 podStartE2EDuration="4.573780605s" podCreationTimestamp="2026-02-17 15:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:12.54853844 +0000 UTC m=+1198.964378926" watchObservedRunningTime="2026-02-17 
15:12:12.573780605 +0000 UTC m=+1198.989621081" Feb 17 15:12:12 crc kubenswrapper[4717]: I0217 15:12:12.975957 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-dc9f5cc6-vmvkc"] Feb 17 15:12:13 crc kubenswrapper[4717]: I0217 15:12:13.215577 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7fb4dbdc46-npdpn"] Feb 17 15:12:13 crc kubenswrapper[4717]: I0217 15:12:13.518891 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" event={"ID":"46311c1e-077b-4089-8a6e-c8497cd5b796","Type":"ContainerStarted","Data":"2ad436fd972abed4c75eb2082b7da4008e624a7e59edbb26b6682c4677f1820e"} Feb 17 15:12:13 crc kubenswrapper[4717]: I0217 15:12:13.519047 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:13 crc kubenswrapper[4717]: I0217 15:12:13.535682 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"25198d5cc3e56cc5aebcec984262cd8e61ecc75cdcdd440c001e5a8433c2440b"} Feb 17 15:12:13 crc kubenswrapper[4717]: I0217 15:12:13.535927 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7977cbd67d-28z6x" podUID="db14c6ee-db55-4d0e-ace5-888795028d98" containerName="barbican-api-log" containerID="cri-o://3ddd5f1a66da608a3be6c50bc83e584513f22476436c9d9fdc18a7c8c5b4e209" gracePeriod=30 Feb 17 15:12:13 crc kubenswrapper[4717]: I0217 15:12:13.536136 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7977cbd67d-28z6x" podUID="db14c6ee-db55-4d0e-ace5-888795028d98" containerName="barbican-api" containerID="cri-o://02e725f49c8e35a53dda500de67ee1aa36dae22b1627cb7aa9a638c093d9c883" gracePeriod=30 Feb 17 15:12:13 crc kubenswrapper[4717]: I0217 15:12:13.535947 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:13 crc kubenswrapper[4717]: I0217 15:12:13.536504 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7ff59556db-hl8w5" Feb 17 15:12:13 crc kubenswrapper[4717]: I0217 15:12:13.536522 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7ff59556db-hl8w5" Feb 17 15:12:13 crc kubenswrapper[4717]: I0217 15:12:13.548935 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" podStartSLOduration=5.548913144 podStartE2EDuration="5.548913144s" podCreationTimestamp="2026-02-17 15:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:13.543735337 +0000 UTC m=+1199.959575823" watchObservedRunningTime="2026-02-17 15:12:13.548913144 +0000 UTC m=+1199.964753620" Feb 17 15:12:13 crc kubenswrapper[4717]: I0217 15:12:13.574049 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7977cbd67d-28z6x" podStartSLOduration=4.574030446 podStartE2EDuration="4.574030446s" podCreationTimestamp="2026-02-17 15:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:13.568363215 +0000 UTC m=+1199.984203691" watchObservedRunningTime="2026-02-17 15:12:13.574030446 +0000 UTC m=+1199.989870922" Feb 17 15:12:14 crc kubenswrapper[4717]: W0217 15:12:14.146152 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1670724_02ac_4de2_b161_c2b78ecb9bb0.slice/crio-01a1820f9d3b299ac5dda689e70fa5868da918a4b52f437b9746324c9218fe91 WatchSource:0}: Error finding container 01a1820f9d3b299ac5dda689e70fa5868da918a4b52f437b9746324c9218fe91: Status 404 
returned error can't find the container with id 01a1820f9d3b299ac5dda689e70fa5868da918a4b52f437b9746324c9218fe91 Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.541713 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.589769 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dc9f5cc6-vmvkc" event={"ID":"24915ee8-ac0d-4c45-826f-6ca18351c1fd","Type":"ContainerStarted","Data":"df0322c4cf29eb3f0219215aa3d3e6d3cf0a6e5eec8930cf0238b2893c30ebba"} Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.599275 4717 generic.go:334] "Generic (PLEG): container finished" podID="db14c6ee-db55-4d0e-ace5-888795028d98" containerID="02e725f49c8e35a53dda500de67ee1aa36dae22b1627cb7aa9a638c093d9c883" exitCode=0 Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.599308 4717 generic.go:334] "Generic (PLEG): container finished" podID="db14c6ee-db55-4d0e-ace5-888795028d98" containerID="3ddd5f1a66da608a3be6c50bc83e584513f22476436c9d9fdc18a7c8c5b4e209" exitCode=143 Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.603315 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7977cbd67d-28z6x" event={"ID":"db14c6ee-db55-4d0e-ace5-888795028d98","Type":"ContainerDied","Data":"02e725f49c8e35a53dda500de67ee1aa36dae22b1627cb7aa9a638c093d9c883"} Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.603368 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7977cbd67d-28z6x" event={"ID":"db14c6ee-db55-4d0e-ace5-888795028d98","Type":"ContainerDied","Data":"3ddd5f1a66da608a3be6c50bc83e584513f22476436c9d9fdc18a7c8c5b4e209"} Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.636789 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fb4dbdc46-npdpn" 
event={"ID":"c1670724-02ac-4de2-b161-c2b78ecb9bb0","Type":"ContainerStarted","Data":"01a1820f9d3b299ac5dda689e70fa5868da918a4b52f437b9746324c9218fe91"} Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.734784 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.903975 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data\") pod \"db14c6ee-db55-4d0e-ace5-888795028d98\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.904163 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db14c6ee-db55-4d0e-ace5-888795028d98-logs\") pod \"db14c6ee-db55-4d0e-ace5-888795028d98\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.904223 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sjrn\" (UniqueName: \"kubernetes.io/projected/db14c6ee-db55-4d0e-ace5-888795028d98-kube-api-access-5sjrn\") pod \"db14c6ee-db55-4d0e-ace5-888795028d98\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.904242 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-combined-ca-bundle\") pod \"db14c6ee-db55-4d0e-ace5-888795028d98\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.904361 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data-custom\") pod \"db14c6ee-db55-4d0e-ace5-888795028d98\" (UID: \"db14c6ee-db55-4d0e-ace5-888795028d98\") " Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.905299 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db14c6ee-db55-4d0e-ace5-888795028d98-logs" (OuterVolumeSpecName: "logs") pod "db14c6ee-db55-4d0e-ace5-888795028d98" (UID: "db14c6ee-db55-4d0e-ace5-888795028d98"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.909207 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "db14c6ee-db55-4d0e-ace5-888795028d98" (UID: "db14c6ee-db55-4d0e-ace5-888795028d98"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.910236 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db14c6ee-db55-4d0e-ace5-888795028d98-kube-api-access-5sjrn" (OuterVolumeSpecName: "kube-api-access-5sjrn") pod "db14c6ee-db55-4d0e-ace5-888795028d98" (UID: "db14c6ee-db55-4d0e-ace5-888795028d98"). InnerVolumeSpecName "kube-api-access-5sjrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:14 crc kubenswrapper[4717]: I0217 15:12:14.982171 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db14c6ee-db55-4d0e-ace5-888795028d98" (UID: "db14c6ee-db55-4d0e-ace5-888795028d98"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.010524 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.010554 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db14c6ee-db55-4d0e-ace5-888795028d98-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.010565 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sjrn\" (UniqueName: \"kubernetes.io/projected/db14c6ee-db55-4d0e-ace5-888795028d98-kube-api-access-5sjrn\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.010576 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.026288 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b88fd5cc6-dqjmc" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.034717 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data" (OuterVolumeSpecName: "config-data") pod "db14c6ee-db55-4d0e-ace5-888795028d98" (UID: "db14c6ee-db55-4d0e-ace5-888795028d98"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.089489 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-85b46995b-rj5bq" podUID="04bb64de-6640-4f6a-9052-ff0edf9dacb8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.114470 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db14c6ee-db55-4d0e-ace5-888795028d98-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.405704 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.428411 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.657360 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-96ddcd589-gq7qg" event={"ID":"0b3b8192-afd0-44c1-886f-7dc072112460","Type":"ContainerStarted","Data":"7bb94e100af95a0ec39f60e9544d90c736124e12e47685ff6fc5a5c36e992b10"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.657405 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-96ddcd589-gq7qg" event={"ID":"0b3b8192-afd0-44c1-886f-7dc072112460","Type":"ContainerStarted","Data":"22cb7f63e0b2d6626c9febf15e52dda3917cefe17775058ccac2e1f955e75ce5"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.669281 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" 
event={"ID":"914980ae-b392-46a5-bf14-0a80569736f1","Type":"ContainerStarted","Data":"391415c6ff38983bcc46be63680ab336e94d01c42209ece053b124e57131cce7"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.669333 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" event={"ID":"914980ae-b392-46a5-bf14-0a80569736f1","Type":"ContainerStarted","Data":"9a94cf52f360a961c2d2d73ebe452ee9cbd065ac00089c98ebdd0379dbefc7a2"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.696785 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fb4dbdc46-npdpn" event={"ID":"c1670724-02ac-4de2-b161-c2b78ecb9bb0","Type":"ContainerStarted","Data":"d803f97dd69ca78567ef2c013b0fb3175647eb1036eeb65bc5b9abdf1e9f8bb3"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.696839 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fb4dbdc46-npdpn" event={"ID":"c1670724-02ac-4de2-b161-c2b78ecb9bb0","Type":"ContainerStarted","Data":"d794d032ba938dad37a10d522442685e1e3c716da679e920b9ae32909648044e"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.697266 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.697373 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7fb4dbdc46-npdpn" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.703209 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-96ddcd589-gq7qg" podStartSLOduration=4.354758586 podStartE2EDuration="7.703192045s" podCreationTimestamp="2026-02-17 15:12:08 +0000 UTC" firstStartedPulling="2026-02-17 15:12:10.962818759 +0000 UTC m=+1197.378659235" lastFinishedPulling="2026-02-17 15:12:14.311252218 +0000 UTC m=+1200.727092694" observedRunningTime="2026-02-17 15:12:15.688507698 +0000 UTC m=+1202.104348174" 
watchObservedRunningTime="2026-02-17 15:12:15.703192045 +0000 UTC m=+1202.119032521" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.723464 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dc9f5cc6-vmvkc" event={"ID":"24915ee8-ac0d-4c45-826f-6ca18351c1fd","Type":"ContainerStarted","Data":"ac59ca0b23d45f76015926c31ac8dc1d2a00490aac5abec7a3351f9afe4f0318"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.723511 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dc9f5cc6-vmvkc" event={"ID":"24915ee8-ac0d-4c45-826f-6ca18351c1fd","Type":"ContainerStarted","Data":"9a4f9b0acacff94cbb500d28e01d8281a225b8372fd0b9fb3faca3fce32ae871"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.724372 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.724402 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.726189 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" podStartSLOduration=4.406544474 podStartE2EDuration="7.726162076s" podCreationTimestamp="2026-02-17 15:12:08 +0000 UTC" firstStartedPulling="2026-02-17 15:12:10.963419506 +0000 UTC m=+1197.379259972" lastFinishedPulling="2026-02-17 15:12:14.283037098 +0000 UTC m=+1200.698877574" observedRunningTime="2026-02-17 15:12:15.710535753 +0000 UTC m=+1202.126376239" watchObservedRunningTime="2026-02-17 15:12:15.726162076 +0000 UTC m=+1202.142002552" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.764854 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" 
event={"ID":"492fb01b-1889-47ff-b334-750b9614cc60","Type":"ContainerStarted","Data":"545ecec0d026fba5567b3f72f45c8b9e4db47b521a03d6348f627cfb43ba1801"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.764896 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" event={"ID":"492fb01b-1889-47ff-b334-750b9614cc60","Type":"ContainerStarted","Data":"0d5336556b0255ad36708372e64086c89c46167c1314c601c2a4720005dcb894"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.779337 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b7b6cb599-wtwlr"] Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.781044 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7fb4dbdc46-npdpn" podStartSLOduration=4.781026722 podStartE2EDuration="4.781026722s" podCreationTimestamp="2026-02-17 15:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:15.739652619 +0000 UTC m=+1202.155493095" watchObservedRunningTime="2026-02-17 15:12:15.781026722 +0000 UTC m=+1202.196867198" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.788347 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7977cbd67d-28z6x" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.791147 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7977cbd67d-28z6x" event={"ID":"db14c6ee-db55-4d0e-ace5-888795028d98","Type":"ContainerDied","Data":"ccf86e9aefee981da8b75b01c365cca800c9ed59e713b7b3acbf2db3b3585c7b"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.791193 4717 scope.go:117] "RemoveContainer" containerID="02e725f49c8e35a53dda500de67ee1aa36dae22b1627cb7aa9a638c093d9c883" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.818137 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-dc9f5cc6-vmvkc" podStartSLOduration=4.818119273 podStartE2EDuration="4.818119273s" podCreationTimestamp="2026-02-17 15:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:15.766094308 +0000 UTC m=+1202.181934794" watchObservedRunningTime="2026-02-17 15:12:15.818119273 +0000 UTC m=+1202.233959739" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.828418 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-678d9d64d6-sj9md" podStartSLOduration=4.4982910050000005 podStartE2EDuration="7.828397945s" podCreationTimestamp="2026-02-17 15:12:08 +0000 UTC" firstStartedPulling="2026-02-17 15:12:10.955247674 +0000 UTC m=+1197.371088150" lastFinishedPulling="2026-02-17 15:12:14.285354614 +0000 UTC m=+1200.701195090" observedRunningTime="2026-02-17 15:12:15.78802136 +0000 UTC m=+1202.203861836" watchObservedRunningTime="2026-02-17 15:12:15.828397945 +0000 UTC m=+1202.244238421" Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.840522 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b587759fd-cw959" 
event={"ID":"bb4300fb-9575-47b9-89a3-ca587932319f","Type":"ContainerStarted","Data":"0125b6b630f7d917f4a478598b719c991ee85eb9dea460298b089bb1b90940a4"} Feb 17 15:12:15 crc kubenswrapper[4717]: I0217 15:12:15.840576 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b587759fd-cw959" event={"ID":"bb4300fb-9575-47b9-89a3-ca587932319f","Type":"ContainerStarted","Data":"d45680eb11560abe16fbe89f3ffc9f227762fcc0c00fee5c80af39723a48201d"} Feb 17 15:12:16 crc kubenswrapper[4717]: I0217 15:12:15.999974 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-b587759fd-cw959"] Feb 17 15:12:16 crc kubenswrapper[4717]: I0217 15:12:16.011321 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7977cbd67d-28z6x"] Feb 17 15:12:16 crc kubenswrapper[4717]: I0217 15:12:16.023253 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7977cbd67d-28z6x"] Feb 17 15:12:16 crc kubenswrapper[4717]: I0217 15:12:16.026419 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 15:12:16 crc kubenswrapper[4717]: I0217 15:12:16.027850 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-b587759fd-cw959" podStartSLOduration=3.582450328 podStartE2EDuration="8.027832859s" podCreationTimestamp="2026-02-17 15:12:08 +0000 UTC" firstStartedPulling="2026-02-17 15:12:09.839839259 +0000 UTC m=+1196.255679735" lastFinishedPulling="2026-02-17 15:12:14.28522179 +0000 UTC m=+1200.701062266" observedRunningTime="2026-02-17 15:12:15.956443025 +0000 UTC m=+1202.372283501" watchObservedRunningTime="2026-02-17 15:12:16.027832859 +0000 UTC m=+1202.443673335" Feb 17 15:12:16 crc kubenswrapper[4717]: I0217 15:12:16.688315 4717 scope.go:117] "RemoveContainer" containerID="3ddd5f1a66da608a3be6c50bc83e584513f22476436c9d9fdc18a7c8c5b4e209" Feb 17 
15:12:17 crc kubenswrapper[4717]: I0217 15:12:17.858778 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db14c6ee-db55-4d0e-ace5-888795028d98" path="/var/lib/kubelet/pods/db14c6ee-db55-4d0e-ace5-888795028d98/volumes" Feb 17 15:12:17 crc kubenswrapper[4717]: I0217 15:12:17.861449 4717 generic.go:334] "Generic (PLEG): container finished" podID="b9c9596c-6124-44d0-b06b-a99477938b79" containerID="ebb33749d4e4dad6fa211bddb69de895fd6053aac3712d7fc8155a6bf83ada93" exitCode=0 Feb 17 15:12:17 crc kubenswrapper[4717]: I0217 15:12:17.861556 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s5lzg" event={"ID":"b9c9596c-6124-44d0-b06b-a99477938b79","Type":"ContainerDied","Data":"ebb33749d4e4dad6fa211bddb69de895fd6053aac3712d7fc8155a6bf83ada93"} Feb 17 15:12:17 crc kubenswrapper[4717]: I0217 15:12:17.880883 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"4989f545a93f5522021cd7f0c63269eac031cf639dd27e90e503b8efc17c9869"} Feb 17 15:12:17 crc kubenswrapper[4717]: I0217 15:12:17.880949 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"eab0fec9fe39c37d06d76fbeb352c9755d844e3cdd187857a557f79434d88f5d"} Feb 17 15:12:17 crc kubenswrapper[4717]: I0217 15:12:17.880960 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"78c1d6ac25551edf0419d0028a876d2d941a27f0f5d6a13b5b191bd86d87ba31"} Feb 17 15:12:17 crc kubenswrapper[4717]: I0217 15:12:17.881131 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-b587759fd-cw959" podUID="bb4300fb-9575-47b9-89a3-ca587932319f" containerName="barbican-keystone-listener-log" 
containerID="cri-o://d45680eb11560abe16fbe89f3ffc9f227762fcc0c00fee5c80af39723a48201d" gracePeriod=30 Feb 17 15:12:17 crc kubenswrapper[4717]: I0217 15:12:17.881251 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-b587759fd-cw959" podUID="bb4300fb-9575-47b9-89a3-ca587932319f" containerName="barbican-keystone-listener" containerID="cri-o://0125b6b630f7d917f4a478598b719c991ee85eb9dea460298b089bb1b90940a4" gracePeriod=30 Feb 17 15:12:17 crc kubenswrapper[4717]: I0217 15:12:17.881565 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" podUID="914980ae-b392-46a5-bf14-0a80569736f1" containerName="barbican-worker" containerID="cri-o://391415c6ff38983bcc46be63680ab336e94d01c42209ece053b124e57131cce7" gracePeriod=30 Feb 17 15:12:17 crc kubenswrapper[4717]: I0217 15:12:17.883021 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" podUID="914980ae-b392-46a5-bf14-0a80569736f1" containerName="barbican-worker-log" containerID="cri-o://9a94cf52f360a961c2d2d73ebe452ee9cbd065ac00089c98ebdd0379dbefc7a2" gracePeriod=30 Feb 17 15:12:18 crc kubenswrapper[4717]: I0217 15:12:18.197049 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 15:12:18 crc kubenswrapper[4717]: I0217 15:12:18.898154 4717 generic.go:334] "Generic (PLEG): container finished" podID="bb4300fb-9575-47b9-89a3-ca587932319f" containerID="0125b6b630f7d917f4a478598b719c991ee85eb9dea460298b089bb1b90940a4" exitCode=0 Feb 17 15:12:18 crc kubenswrapper[4717]: I0217 15:12:18.898182 4717 generic.go:334] "Generic (PLEG): container finished" podID="bb4300fb-9575-47b9-89a3-ca587932319f" containerID="d45680eb11560abe16fbe89f3ffc9f227762fcc0c00fee5c80af39723a48201d" exitCode=143 Feb 17 15:12:18 crc kubenswrapper[4717]: I0217 15:12:18.898228 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b587759fd-cw959" event={"ID":"bb4300fb-9575-47b9-89a3-ca587932319f","Type":"ContainerDied","Data":"0125b6b630f7d917f4a478598b719c991ee85eb9dea460298b089bb1b90940a4"} Feb 17 15:12:18 crc kubenswrapper[4717]: I0217 15:12:18.898304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b587759fd-cw959" event={"ID":"bb4300fb-9575-47b9-89a3-ca587932319f","Type":"ContainerDied","Data":"d45680eb11560abe16fbe89f3ffc9f227762fcc0c00fee5c80af39723a48201d"} Feb 17 15:12:18 crc kubenswrapper[4717]: I0217 15:12:18.901100 4717 generic.go:334] "Generic (PLEG): container finished" podID="914980ae-b392-46a5-bf14-0a80569736f1" containerID="391415c6ff38983bcc46be63680ab336e94d01c42209ece053b124e57131cce7" exitCode=0 Feb 17 15:12:18 crc kubenswrapper[4717]: I0217 15:12:18.901144 4717 generic.go:334] "Generic (PLEG): container finished" podID="914980ae-b392-46a5-bf14-0a80569736f1" containerID="9a94cf52f360a961c2d2d73ebe452ee9cbd065ac00089c98ebdd0379dbefc7a2" exitCode=143 Feb 17 15:12:18 crc kubenswrapper[4717]: I0217 15:12:18.901130 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" event={"ID":"914980ae-b392-46a5-bf14-0a80569736f1","Type":"ContainerDied","Data":"391415c6ff38983bcc46be63680ab336e94d01c42209ece053b124e57131cce7"} Feb 17 15:12:18 crc kubenswrapper[4717]: I0217 15:12:18.901352 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" event={"ID":"914980ae-b392-46a5-bf14-0a80569736f1","Type":"ContainerDied","Data":"9a94cf52f360a961c2d2d73ebe452ee9cbd065ac00089c98ebdd0379dbefc7a2"} Feb 17 15:12:19 crc kubenswrapper[4717]: I0217 15:12:19.330397 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" Feb 17 15:12:19 crc kubenswrapper[4717]: I0217 15:12:19.399565 4717 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-blhdn"] Feb 17 15:12:19 crc kubenswrapper[4717]: I0217 15:12:19.399840 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fb745b69-blhdn" podUID="d5afdf52-2ad7-4957-8bff-ca95fee13432" containerName="dnsmasq-dns" containerID="cri-o://915b734f054a00c83e2030292e96a88928e5a19cd6fb858daa06eab626edeb61" gracePeriod=10 Feb 17 15:12:19 crc kubenswrapper[4717]: I0217 15:12:19.931241 4717 generic.go:334] "Generic (PLEG): container finished" podID="d5afdf52-2ad7-4957-8bff-ca95fee13432" containerID="915b734f054a00c83e2030292e96a88928e5a19cd6fb858daa06eab626edeb61" exitCode=0 Feb 17 15:12:19 crc kubenswrapper[4717]: I0217 15:12:19.931290 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-blhdn" event={"ID":"d5afdf52-2ad7-4957-8bff-ca95fee13432","Type":"ContainerDied","Data":"915b734f054a00c83e2030292e96a88928e5a19cd6fb858daa06eab626edeb61"} Feb 17 15:12:20 crc kubenswrapper[4717]: I0217 15:12:20.808587 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:12:20 crc kubenswrapper[4717]: I0217 15:12:20.808914 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:12:20 crc kubenswrapper[4717]: I0217 15:12:20.994654 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ff59556db-hl8w5" Feb 17 15:12:21 crc kubenswrapper[4717]: I0217 15:12:21.349710 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ff59556db-hl8w5" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.218489 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fb745b69-blhdn" podUID="d5afdf52-2ad7-4957-8bff-ca95fee13432" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.414672 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d69c5bc8d-pt8k6" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.492859 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.635569 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c9596c-6124-44d0-b06b-a99477938b79-etc-machine-id\") pod \"b9c9596c-6124-44d0-b06b-a99477938b79\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.635931 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-scripts\") pod \"b9c9596c-6124-44d0-b06b-a99477938b79\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.635967 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-combined-ca-bundle\") pod \"b9c9596c-6124-44d0-b06b-a99477938b79\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.635660 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/b9c9596c-6124-44d0-b06b-a99477938b79-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b9c9596c-6124-44d0-b06b-a99477938b79" (UID: "b9c9596c-6124-44d0-b06b-a99477938b79"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.636073 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-config-data\") pod \"b9c9596c-6124-44d0-b06b-a99477938b79\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.636138 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-db-sync-config-data\") pod \"b9c9596c-6124-44d0-b06b-a99477938b79\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.636169 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2klgj\" (UniqueName: \"kubernetes.io/projected/b9c9596c-6124-44d0-b06b-a99477938b79-kube-api-access-2klgj\") pod \"b9c9596c-6124-44d0-b06b-a99477938b79\" (UID: \"b9c9596c-6124-44d0-b06b-a99477938b79\") " Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.636731 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c9596c-6124-44d0-b06b-a99477938b79-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.644887 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c9596c-6124-44d0-b06b-a99477938b79-kube-api-access-2klgj" (OuterVolumeSpecName: "kube-api-access-2klgj") pod "b9c9596c-6124-44d0-b06b-a99477938b79" (UID: 
"b9c9596c-6124-44d0-b06b-a99477938b79"). InnerVolumeSpecName "kube-api-access-2klgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.659396 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-scripts" (OuterVolumeSpecName: "scripts") pod "b9c9596c-6124-44d0-b06b-a99477938b79" (UID: "b9c9596c-6124-44d0-b06b-a99477938b79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.680238 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b9c9596c-6124-44d0-b06b-a99477938b79" (UID: "b9c9596c-6124-44d0-b06b-a99477938b79"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.687176 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9c9596c-6124-44d0-b06b-a99477938b79" (UID: "b9c9596c-6124-44d0-b06b-a99477938b79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.705845 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.716449 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-config-data" (OuterVolumeSpecName: "config-data") pod "b9c9596c-6124-44d0-b06b-a99477938b79" (UID: "b9c9596c-6124-44d0-b06b-a99477938b79"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.726132 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c77f44b65-wwnph"] Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.728365 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c77f44b65-wwnph" podUID="4e706234-18f2-467c-8681-79402d25eb1c" containerName="neutron-httpd" containerID="cri-o://5b321fba023192bfa1cfb97df9238f2d1cd260a5dfa1a7792d062cbe0d32c773" gracePeriod=30 Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.728315 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c77f44b65-wwnph" podUID="4e706234-18f2-467c-8681-79402d25eb1c" containerName="neutron-api" containerID="cri-o://9b0e8901f70ef1717d2e1dc0e35c08ede9faad7674cd112478e5224ef0aef980" gracePeriod=30 Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.743569 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.743601 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.743611 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2klgj\" (UniqueName: \"kubernetes.io/projected/b9c9596c-6124-44d0-b06b-a99477938b79-kube-api-access-2klgj\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.743622 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-scripts\") on node 
\"crc\" DevicePath \"\"" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.743631 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c9596c-6124-44d0-b06b-a99477938b79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.763694 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.773923 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b696c957f-jwh8l"] Feb 17 15:12:23 crc kubenswrapper[4717]: E0217 15:12:23.774306 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db14c6ee-db55-4d0e-ace5-888795028d98" containerName="barbican-api-log" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.774319 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="db14c6ee-db55-4d0e-ace5-888795028d98" containerName="barbican-api-log" Feb 17 15:12:23 crc kubenswrapper[4717]: E0217 15:12:23.774335 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db14c6ee-db55-4d0e-ace5-888795028d98" containerName="barbican-api" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.774341 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="db14c6ee-db55-4d0e-ace5-888795028d98" containerName="barbican-api" Feb 17 15:12:23 crc kubenswrapper[4717]: E0217 15:12:23.774353 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c9596c-6124-44d0-b06b-a99477938b79" containerName="cinder-db-sync" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.774359 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c9596c-6124-44d0-b06b-a99477938b79" containerName="cinder-db-sync" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.774591 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9c9596c-6124-44d0-b06b-a99477938b79" 
containerName="cinder-db-sync" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.774624 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="db14c6ee-db55-4d0e-ace5-888795028d98" containerName="barbican-api" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.774635 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="db14c6ee-db55-4d0e-ace5-888795028d98" containerName="barbican-api-log" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.777057 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.802884 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b696c957f-jwh8l"] Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.948786 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-internal-tls-certs\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.948873 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n74x\" (UniqueName: \"kubernetes.io/projected/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-kube-api-access-7n74x\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.948956 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-ovndb-tls-certs\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 
15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.948997 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-combined-ca-bundle\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.949025 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-config\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.949061 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-public-tls-certs\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.949122 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-httpd-config\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.967363 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s5lzg" event={"ID":"b9c9596c-6124-44d0-b06b-a99477938b79","Type":"ContainerDied","Data":"bb62fc292e03df68e9c8f52e8a06201e030456044052f9076e355b856862eddc"} Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.967599 4717 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="bb62fc292e03df68e9c8f52e8a06201e030456044052f9076e355b856862eddc" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.967574 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s5lzg" Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.969879 4717 generic.go:334] "Generic (PLEG): container finished" podID="4e706234-18f2-467c-8681-79402d25eb1c" containerID="5b321fba023192bfa1cfb97df9238f2d1cd260a5dfa1a7792d062cbe0d32c773" exitCode=0 Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.969909 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c77f44b65-wwnph" event={"ID":"4e706234-18f2-467c-8681-79402d25eb1c","Type":"ContainerDied","Data":"5b321fba023192bfa1cfb97df9238f2d1cd260a5dfa1a7792d062cbe0d32c773"} Feb 17 15:12:23 crc kubenswrapper[4717]: I0217 15:12:23.983034 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-dc9f5cc6-vmvkc" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.036293 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7ff59556db-hl8w5"] Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.036507 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7ff59556db-hl8w5" podUID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerName="barbican-api-log" containerID="cri-o://3838e90e0e40e4cbb19960a2004379528c4e2eaff1928a49b1253b50209ed9d3" gracePeriod=30 Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.036895 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7ff59556db-hl8w5" podUID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerName="barbican-api" containerID="cri-o://c4724f7dcdc72e5dac94a357887527f8daee7f38df85c48e6a901084055c22f9" gracePeriod=30 Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.052293 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-combined-ca-bundle\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.052367 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-config\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.052397 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-public-tls-certs\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.052424 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-httpd-config\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.052502 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-internal-tls-certs\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.052549 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n74x\" (UniqueName: 
\"kubernetes.io/projected/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-kube-api-access-7n74x\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.052589 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-ovndb-tls-certs\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.059785 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-httpd-config\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.060324 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-internal-tls-certs\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.062472 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-combined-ca-bundle\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.064823 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-config\") pod \"neutron-b696c957f-jwh8l\" (UID: 
\"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.065149 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-ovndb-tls-certs\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.079736 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n74x\" (UniqueName: \"kubernetes.io/projected/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-kube-api-access-7n74x\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.079942 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fc5d05-07ba-476e-835e-61dfa2e9edc1-public-tls-certs\") pod \"neutron-b696c957f-jwh8l\" (UID: \"d5fc5d05-07ba-476e-835e-61dfa2e9edc1\") " pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.105979 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.788675 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.790165 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.796832 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bzltb" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.797031 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.797159 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.797310 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.815227 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.866444 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.866508 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-scripts\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.866571 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6v6\" (UniqueName: \"kubernetes.io/projected/55a43ce8-bd32-4179-a265-04112693da30-kube-api-access-5l6v6\") pod \"cinder-scheduler-0\" (UID: 
\"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.866590 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.866609 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55a43ce8-bd32-4179-a265-04112693da30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.866700 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.891815 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57fff66767-xkhmf"] Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.898804 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.918912 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57fff66767-xkhmf"] Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.972118 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55a43ce8-bd32-4179-a265-04112693da30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.972226 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55a43ce8-bd32-4179-a265-04112693da30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.972276 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrtl8\" (UniqueName: \"kubernetes.io/projected/8de54124-8e8d-4a21-b058-d1007d78d6ce-kube-api-access-jrtl8\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.972419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-config\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.972455 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-dns-svc\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.972787 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.972863 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-sb\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.972912 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.972952 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-nb\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.972993 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-scripts\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.973196 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6v6\" (UniqueName: \"kubernetes.io/projected/55a43ce8-bd32-4179-a265-04112693da30-kube-api-access-5l6v6\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.973221 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.978666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.984339 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-scripts\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.985595 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " 
pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.986408 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.989729 4717 generic.go:334] "Generic (PLEG): container finished" podID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerID="3838e90e0e40e4cbb19960a2004379528c4e2eaff1928a49b1253b50209ed9d3" exitCode=143 Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.989789 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff59556db-hl8w5" event={"ID":"f0360385-5ad6-4d9b-b825-1cae0c23d814","Type":"ContainerDied","Data":"3838e90e0e40e4cbb19960a2004379528c4e2eaff1928a49b1253b50209ed9d3"} Feb 17 15:12:24 crc kubenswrapper[4717]: I0217 15:12:24.990496 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6v6\" (UniqueName: \"kubernetes.io/projected/55a43ce8-bd32-4179-a265-04112693da30-kube-api-access-5l6v6\") pod \"cinder-scheduler-0\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.116397 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-dns-svc\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.116795 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-sb\") pod 
\"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.116867 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-nb\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.117036 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrtl8\" (UniqueName: \"kubernetes.io/projected/8de54124-8e8d-4a21-b058-d1007d78d6ce-kube-api-access-jrtl8\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.117107 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-config\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.117510 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.118905 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-dns-svc\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.119455 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-sb\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.119968 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-nb\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.121329 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-config\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.145755 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrtl8\" (UniqueName: \"kubernetes.io/projected/8de54124-8e8d-4a21-b058-d1007d78d6ce-kube-api-access-jrtl8\") pod \"dnsmasq-dns-57fff66767-xkhmf\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") " pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 
15:12:25.149203 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.154558 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.159213 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.238483 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57fff66767-xkhmf" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.242283 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-scripts\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.242352 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data-custom\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.242387 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8363289-fd40-402e-a82d-2d7954bdca28-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.242428 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8363289-fd40-402e-a82d-2d7954bdca28-logs\") pod 
\"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.242445 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.242513 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.242539 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlsfh\" (UniqueName: \"kubernetes.io/projected/d8363289-fd40-402e-a82d-2d7954bdca28-kube-api-access-dlsfh\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.246911 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.348311 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.348377 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlsfh\" (UniqueName: 
\"kubernetes.io/projected/d8363289-fd40-402e-a82d-2d7954bdca28-kube-api-access-dlsfh\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.348452 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-scripts\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.348497 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data-custom\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.348524 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8363289-fd40-402e-a82d-2d7954bdca28-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.348558 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8363289-fd40-402e-a82d-2d7954bdca28-logs\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.348575 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.349120 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8363289-fd40-402e-a82d-2d7954bdca28-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.349541 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8363289-fd40-402e-a82d-2d7954bdca28-logs\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.354303 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.356681 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-scripts\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.356963 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.371136 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlsfh\" (UniqueName: \"kubernetes.io/projected/d8363289-fd40-402e-a82d-2d7954bdca28-kube-api-access-dlsfh\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" 
Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.387821 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data-custom\") pod \"cinder-api-0\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " pod="openstack/cinder-api-0" Feb 17 15:12:25 crc kubenswrapper[4717]: I0217 15:12:25.563648 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 15:12:26 crc kubenswrapper[4717]: I0217 15:12:26.133246 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-c77f44b65-wwnph" podUID="4e706234-18f2-467c-8681-79402d25eb1c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9696/\": dial tcp 10.217.0.151:9696: connect: connection refused" Feb 17 15:12:26 crc kubenswrapper[4717]: I0217 15:12:26.767141 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.182784 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.226543 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7ff59556db-hl8w5" podUID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:56048->10.217.0.158:9311: read: connection reset by peer" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.226573 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7ff59556db-hl8w5" podUID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:56038->10.217.0.158:9311: read: connection reset by 
peer" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.338281 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.502404 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.539968 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:12:27 crc kubenswrapper[4717]: E0217 15:12:27.543823 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Feb 17 15:12:27 crc kubenswrapper[4717]: E0217 15:12:27.544120 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc
/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v6q6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 15:12:27 crc kubenswrapper[4717]: E0217 15:12:27.546053 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.601844 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-combined-ca-bundle\") pod \"914980ae-b392-46a5-bf14-0a80569736f1\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.601895 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlhkr\" (UniqueName: \"kubernetes.io/projected/914980ae-b392-46a5-bf14-0a80569736f1-kube-api-access-rlhkr\") pod \"914980ae-b392-46a5-bf14-0a80569736f1\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.601959 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data\") pod \"914980ae-b392-46a5-bf14-0a80569736f1\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.602062 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/914980ae-b392-46a5-bf14-0a80569736f1-logs\") pod \"914980ae-b392-46a5-bf14-0a80569736f1\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.602134 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data-custom\") pod \"914980ae-b392-46a5-bf14-0a80569736f1\" (UID: \"914980ae-b392-46a5-bf14-0a80569736f1\") " Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.613140 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "914980ae-b392-46a5-bf14-0a80569736f1" (UID: "914980ae-b392-46a5-bf14-0a80569736f1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.623038 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/914980ae-b392-46a5-bf14-0a80569736f1-logs" (OuterVolumeSpecName: "logs") pod "914980ae-b392-46a5-bf14-0a80569736f1" (UID: "914980ae-b392-46a5-bf14-0a80569736f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.630402 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/914980ae-b392-46a5-bf14-0a80569736f1-kube-api-access-rlhkr" (OuterVolumeSpecName: "kube-api-access-rlhkr") pod "914980ae-b392-46a5-bf14-0a80569736f1" (UID: "914980ae-b392-46a5-bf14-0a80569736f1"). InnerVolumeSpecName "kube-api-access-rlhkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.664607 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "914980ae-b392-46a5-bf14-0a80569736f1" (UID: "914980ae-b392-46a5-bf14-0a80569736f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.706907 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-sb\") pod \"d5afdf52-2ad7-4957-8bff-ca95fee13432\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.707351 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-nb\") pod \"d5afdf52-2ad7-4957-8bff-ca95fee13432\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.707487 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-dns-svc\") pod \"d5afdf52-2ad7-4957-8bff-ca95fee13432\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.707573 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kblvc\" (UniqueName: \"kubernetes.io/projected/d5afdf52-2ad7-4957-8bff-ca95fee13432-kube-api-access-kblvc\") pod \"d5afdf52-2ad7-4957-8bff-ca95fee13432\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.707590 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-config\") pod \"d5afdf52-2ad7-4957-8bff-ca95fee13432\" (UID: \"d5afdf52-2ad7-4957-8bff-ca95fee13432\") " Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.707977 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/914980ae-b392-46a5-bf14-0a80569736f1-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.707991 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.708001 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.708011 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlhkr\" (UniqueName: \"kubernetes.io/projected/914980ae-b392-46a5-bf14-0a80569736f1-kube-api-access-rlhkr\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.721615 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5afdf52-2ad7-4957-8bff-ca95fee13432-kube-api-access-kblvc" (OuterVolumeSpecName: "kube-api-access-kblvc") pod "d5afdf52-2ad7-4957-8bff-ca95fee13432" (UID: "d5afdf52-2ad7-4957-8bff-ca95fee13432"). InnerVolumeSpecName "kube-api-access-kblvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.736948 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data" (OuterVolumeSpecName: "config-data") pod "914980ae-b392-46a5-bf14-0a80569736f1" (UID: "914980ae-b392-46a5-bf14-0a80569736f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.788903 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-config" (OuterVolumeSpecName: "config") pod "d5afdf52-2ad7-4957-8bff-ca95fee13432" (UID: "d5afdf52-2ad7-4957-8bff-ca95fee13432"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.811490 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kblvc\" (UniqueName: \"kubernetes.io/projected/d5afdf52-2ad7-4957-8bff-ca95fee13432-kube-api-access-kblvc\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.811527 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.811537 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/914980ae-b392-46a5-bf14-0a80569736f1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.814842 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5afdf52-2ad7-4957-8bff-ca95fee13432" (UID: "d5afdf52-2ad7-4957-8bff-ca95fee13432"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.830024 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5afdf52-2ad7-4957-8bff-ca95fee13432" (UID: "d5afdf52-2ad7-4957-8bff-ca95fee13432"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.861042 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5afdf52-2ad7-4957-8bff-ca95fee13432" (UID: "d5afdf52-2ad7-4957-8bff-ca95fee13432"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.913257 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.913709 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:27 crc kubenswrapper[4717]: I0217 15:12:27.913725 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5afdf52-2ad7-4957-8bff-ca95fee13432-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.081926 4717 generic.go:334] "Generic (PLEG): container finished" podID="4e706234-18f2-467c-8681-79402d25eb1c" containerID="9b0e8901f70ef1717d2e1dc0e35c08ede9faad7674cd112478e5224ef0aef980" exitCode=0 Feb 17 15:12:28 crc 
kubenswrapper[4717]: I0217 15:12:28.081981 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c77f44b65-wwnph" event={"ID":"4e706234-18f2-467c-8681-79402d25eb1c","Type":"ContainerDied","Data":"9b0e8901f70ef1717d2e1dc0e35c08ede9faad7674cd112478e5224ef0aef980"} Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.093382 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" event={"ID":"914980ae-b392-46a5-bf14-0a80569736f1","Type":"ContainerDied","Data":"b1813be82f008b47a6087379b0e8d011a5bf7fe4b5c837887018fca5b0b37a37"} Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.093440 4717 scope.go:117] "RemoveContainer" containerID="391415c6ff38983bcc46be63680ab336e94d01c42209ece053b124e57131cce7" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.093590 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b7b6cb599-wtwlr" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.097772 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-b587759fd-cw959" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.115183 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"3d7d2d5c132a29c2340cf1cc33703bdf4ab457fd74681397080d36863a741a0c"} Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.134785 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-blhdn" event={"ID":"d5afdf52-2ad7-4957-8bff-ca95fee13432","Type":"ContainerDied","Data":"aef90f1a9830717ce75a35d71555ca8b2057e84b0878bd8c4898dd3d973f4778"} Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.134908 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-blhdn" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.136035 4717 scope.go:117] "RemoveContainer" containerID="9a94cf52f360a961c2d2d73ebe452ee9cbd065ac00089c98ebdd0379dbefc7a2" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.151910 4717 generic.go:334] "Generic (PLEG): container finished" podID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerID="c4724f7dcdc72e5dac94a357887527f8daee7f38df85c48e6a901084055c22f9" exitCode=0 Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.152068 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" containerName="sg-core" containerID="cri-o://04306c04a60664c6f4bcf509c853df0bfc861b7f61f0d79be121eacc4762382e" gracePeriod=30 Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.152229 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff59556db-hl8w5" event={"ID":"f0360385-5ad6-4d9b-b825-1cae0c23d814","Type":"ContainerDied","Data":"c4724f7dcdc72e5dac94a357887527f8daee7f38df85c48e6a901084055c22f9"} Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.176494 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b7b6cb599-wtwlr"] Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.183624 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6b7b6cb599-wtwlr"] Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.229582 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data-custom\") pod \"bb4300fb-9575-47b9-89a3-ca587932319f\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.229661 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data\") pod \"bb4300fb-9575-47b9-89a3-ca587932319f\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.229691 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-combined-ca-bundle\") pod \"bb4300fb-9575-47b9-89a3-ca587932319f\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.229783 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4300fb-9575-47b9-89a3-ca587932319f-logs\") pod \"bb4300fb-9575-47b9-89a3-ca587932319f\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.229913 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffq8s\" (UniqueName: \"kubernetes.io/projected/bb4300fb-9575-47b9-89a3-ca587932319f-kube-api-access-ffq8s\") pod \"bb4300fb-9575-47b9-89a3-ca587932319f\" (UID: \"bb4300fb-9575-47b9-89a3-ca587932319f\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.237313 4717 scope.go:117] "RemoveContainer" containerID="915b734f054a00c83e2030292e96a88928e5a19cd6fb858daa06eab626edeb61" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.238814 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb4300fb-9575-47b9-89a3-ca587932319f-logs" (OuterVolumeSpecName: "logs") pod "bb4300fb-9575-47b9-89a3-ca587932319f" (UID: "bb4300fb-9575-47b9-89a3-ca587932319f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.240901 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb4300fb-9575-47b9-89a3-ca587932319f-kube-api-access-ffq8s" (OuterVolumeSpecName: "kube-api-access-ffq8s") pod "bb4300fb-9575-47b9-89a3-ca587932319f" (UID: "bb4300fb-9575-47b9-89a3-ca587932319f"). InnerVolumeSpecName "kube-api-access-ffq8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.242494 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bb4300fb-9575-47b9-89a3-ca587932319f" (UID: "bb4300fb-9575-47b9-89a3-ca587932319f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.251410 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-blhdn"] Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.266154 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-blhdn"] Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.294792 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7ff59556db-hl8w5" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.301547 4717 scope.go:117] "RemoveContainer" containerID="22b85e01f2962e3483aedf7b29a00fb5a65948fcf4751ac2aab5382a8d9c3b96" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.315770 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb4300fb-9575-47b9-89a3-ca587932319f" (UID: "bb4300fb-9575-47b9-89a3-ca587932319f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.337397 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4300fb-9575-47b9-89a3-ca587932319f-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.337445 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffq8s\" (UniqueName: \"kubernetes.io/projected/bb4300fb-9575-47b9-89a3-ca587932319f-kube-api-access-ffq8s\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.337464 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.337477 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.366358 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data" 
(OuterVolumeSpecName: "config-data") pod "bb4300fb-9575-47b9-89a3-ca587932319f" (UID: "bb4300fb-9575-47b9-89a3-ca587932319f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.438162 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data\") pod \"f0360385-5ad6-4d9b-b825-1cae0c23d814\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.438244 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj287\" (UniqueName: \"kubernetes.io/projected/f0360385-5ad6-4d9b-b825-1cae0c23d814-kube-api-access-xj287\") pod \"f0360385-5ad6-4d9b-b825-1cae0c23d814\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.438355 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0360385-5ad6-4d9b-b825-1cae0c23d814-logs\") pod \"f0360385-5ad6-4d9b-b825-1cae0c23d814\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.438425 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data-custom\") pod \"f0360385-5ad6-4d9b-b825-1cae0c23d814\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.438471 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-combined-ca-bundle\") pod \"f0360385-5ad6-4d9b-b825-1cae0c23d814\" (UID: \"f0360385-5ad6-4d9b-b825-1cae0c23d814\") " Feb 17 15:12:28 crc 
kubenswrapper[4717]: I0217 15:12:28.439063 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4300fb-9575-47b9-89a3-ca587932319f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.445644 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0360385-5ad6-4d9b-b825-1cae0c23d814-logs" (OuterVolumeSpecName: "logs") pod "f0360385-5ad6-4d9b-b825-1cae0c23d814" (UID: "f0360385-5ad6-4d9b-b825-1cae0c23d814"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.459456 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f0360385-5ad6-4d9b-b825-1cae0c23d814" (UID: "f0360385-5ad6-4d9b-b825-1cae0c23d814"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.469379 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0360385-5ad6-4d9b-b825-1cae0c23d814-kube-api-access-xj287" (OuterVolumeSpecName: "kube-api-access-xj287") pod "f0360385-5ad6-4d9b-b825-1cae0c23d814" (UID: "f0360385-5ad6-4d9b-b825-1cae0c23d814"). InnerVolumeSpecName "kube-api-access-xj287". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.487884 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0360385-5ad6-4d9b-b825-1cae0c23d814" (UID: "f0360385-5ad6-4d9b-b825-1cae0c23d814"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.544916 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data" (OuterVolumeSpecName: "config-data") pod "f0360385-5ad6-4d9b-b825-1cae0c23d814" (UID: "f0360385-5ad6-4d9b-b825-1cae0c23d814"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.549678 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.549704 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.549715 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0360385-5ad6-4d9b-b825-1cae0c23d814-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.549726 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj287\" (UniqueName: \"kubernetes.io/projected/f0360385-5ad6-4d9b-b825-1cae0c23d814-kube-api-access-xj287\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.549738 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0360385-5ad6-4d9b-b825-1cae0c23d814-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.671399 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57fff66767-xkhmf"] Feb 17 15:12:28 crc 
kubenswrapper[4717]: I0217 15:12:28.709201 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 15:12:28 crc kubenswrapper[4717]: W0217 15:12:28.727173 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55a43ce8_bd32_4179_a265_04112693da30.slice/crio-08bf4714804b49c3a250c6f4f718f38d549237cfc99347c89349843e102a60d4 WatchSource:0}: Error finding container 08bf4714804b49c3a250c6f4f718f38d549237cfc99347c89349843e102a60d4: Status 404 returned error can't find the container with id 08bf4714804b49c3a250c6f4f718f38d549237cfc99347c89349843e102a60d4 Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.737161 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.755157 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:12:28 crc kubenswrapper[4717]: W0217 15:12:28.811547 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8363289_fd40_402e_a82d_2d7954bdca28.slice/crio-8d76ba10e8e873bb2f265272d2312cf0927574e9f0acb746a45df6a8bcf78edd WatchSource:0}: Error finding container 8d76ba10e8e873bb2f265272d2312cf0927574e9f0acb746a45df6a8bcf78edd: Status 404 returned error can't find the container with id 8d76ba10e8e873bb2f265272d2312cf0927574e9f0acb746a45df6a8bcf78edd Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.812789 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b696c957f-jwh8l"] Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.858928 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-ovndb-tls-certs\") pod 
\"4e706234-18f2-467c-8681-79402d25eb1c\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.859201 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-combined-ca-bundle\") pod \"4e706234-18f2-467c-8681-79402d25eb1c\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.859331 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-httpd-config\") pod \"4e706234-18f2-467c-8681-79402d25eb1c\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.859430 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-internal-tls-certs\") pod \"4e706234-18f2-467c-8681-79402d25eb1c\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.862730 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phd9g\" (UniqueName: \"kubernetes.io/projected/4e706234-18f2-467c-8681-79402d25eb1c-kube-api-access-phd9g\") pod \"4e706234-18f2-467c-8681-79402d25eb1c\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.862837 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-config\") pod \"4e706234-18f2-467c-8681-79402d25eb1c\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.862975 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-public-tls-certs\") pod \"4e706234-18f2-467c-8681-79402d25eb1c\" (UID: \"4e706234-18f2-467c-8681-79402d25eb1c\") " Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.871236 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e706234-18f2-467c-8681-79402d25eb1c-kube-api-access-phd9g" (OuterVolumeSpecName: "kube-api-access-phd9g") pod "4e706234-18f2-467c-8681-79402d25eb1c" (UID: "4e706234-18f2-467c-8681-79402d25eb1c"). InnerVolumeSpecName "kube-api-access-phd9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.975342 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phd9g\" (UniqueName: \"kubernetes.io/projected/4e706234-18f2-467c-8681-79402d25eb1c-kube-api-access-phd9g\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:28 crc kubenswrapper[4717]: I0217 15:12:28.979215 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4e706234-18f2-467c-8681-79402d25eb1c" (UID: "4e706234-18f2-467c-8681-79402d25eb1c"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.081193 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.115264 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4e706234-18f2-467c-8681-79402d25eb1c" (UID: "4e706234-18f2-467c-8681-79402d25eb1c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.158660 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4e706234-18f2-467c-8681-79402d25eb1c" (UID: "4e706234-18f2-467c-8681-79402d25eb1c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.161914 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-config" (OuterVolumeSpecName: "config") pod "4e706234-18f2-467c-8681-79402d25eb1c" (UID: "4e706234-18f2-467c-8681-79402d25eb1c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.170807 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b696c957f-jwh8l" event={"ID":"d5fc5d05-07ba-476e-835e-61dfa2e9edc1","Type":"ContainerStarted","Data":"832727ca186fc215d7834abd2b89a6ffff9b0a966a55deb31630e20b74e780fb"} Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.183454 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.183486 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.183497 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.195594 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e706234-18f2-467c-8681-79402d25eb1c" (UID: "4e706234-18f2-467c-8681-79402d25eb1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.228064 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b587759fd-cw959" event={"ID":"bb4300fb-9575-47b9-89a3-ca587932319f","Type":"ContainerDied","Data":"28d9d8c2590431d72526cd361b4482ee3915e255478dd31295da28098ddf1b39"} Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.228217 4717 scope.go:117] "RemoveContainer" containerID="0125b6b630f7d917f4a478598b719c991ee85eb9dea460298b089bb1b90940a4" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.228560 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-b587759fd-cw959" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.253646 4717 generic.go:334] "Generic (PLEG): container finished" podID="fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" containerID="04306c04a60664c6f4bcf509c853df0bfc861b7f61f0d79be121eacc4762382e" exitCode=2 Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.254326 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1","Type":"ContainerDied","Data":"04306c04a60664c6f4bcf509c853df0bfc861b7f61f0d79be121eacc4762382e"} Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.265799 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"24f19c4432b2436f8b6ae1f4a25cdce1fee7612f254642633a8576f27b5403d4"} Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.266953 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"fd71abe62daa5751845eeed6fed76a4ed6cb7f56bcc975a37724411ea8139e77"} Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.279383 4717 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4e706234-18f2-467c-8681-79402d25eb1c" (UID: "4e706234-18f2-467c-8681-79402d25eb1c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.284743 4717 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.286350 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e706234-18f2-467c-8681-79402d25eb1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.296633 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-b587759fd-cw959"] Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.298298 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff59556db-hl8w5" event={"ID":"f0360385-5ad6-4d9b-b825-1cae0c23d814","Type":"ContainerDied","Data":"fb05ee5e24f80c254a9d3ad89c1d5d31327d0b1c77487b2f8d04f1fa4d88b04a"} Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.298239 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7ff59556db-hl8w5" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.303724 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.304307 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c77f44b65-wwnph" event={"ID":"4e706234-18f2-467c-8681-79402d25eb1c","Type":"ContainerDied","Data":"d438099255207b27838d10495236229940290387c1284e06007791d59263513a"} Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.304421 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c77f44b65-wwnph" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.313468 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-b587759fd-cw959"] Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.313579 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fff66767-xkhmf" event={"ID":"8de54124-8e8d-4a21-b058-d1007d78d6ce","Type":"ContainerStarted","Data":"f990e3fb6ce569ba03954d617fc257aaf61fbb5f5aa31da685e4efc5b63171af"} Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.331946 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8363289-fd40-402e-a82d-2d7954bdca28","Type":"ContainerStarted","Data":"8d76ba10e8e873bb2f265272d2312cf0927574e9f0acb746a45df6a8bcf78edd"} Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.336622 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55a43ce8-bd32-4179-a265-04112693da30","Type":"ContainerStarted","Data":"08bf4714804b49c3a250c6f4f718f38d549237cfc99347c89349843e102a60d4"} Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.365826 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c77f44b65-wwnph"] Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.379986 4717 scope.go:117] "RemoveContainer" containerID="d45680eb11560abe16fbe89f3ffc9f227762fcc0c00fee5c80af39723a48201d" Feb 17 15:12:29 
crc kubenswrapper[4717]: I0217 15:12:29.403167 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c77f44b65-wwnph"] Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.413750 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7ff59556db-hl8w5"] Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.426253 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7ff59556db-hl8w5"] Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.492407 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-config-data\") pod \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.492805 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-scripts\") pod \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.492838 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-run-httpd\") pod \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.492867 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6q6v\" (UniqueName: \"kubernetes.io/projected/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-kube-api-access-v6q6v\") pod \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.492898 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-log-httpd\") pod \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.493016 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-sg-core-conf-yaml\") pod \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.493118 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-combined-ca-bundle\") pod \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\" (UID: \"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1\") " Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.493856 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" (UID: "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.499862 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" (UID: "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.500034 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" (UID: "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.501387 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-config-data" (OuterVolumeSpecName: "config-data") pod "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" (UID: "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.501860 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-scripts" (OuterVolumeSpecName: "scripts") pod "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" (UID: "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.505245 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-kube-api-access-v6q6v" (OuterVolumeSpecName: "kube-api-access-v6q6v") pod "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" (UID: "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1"). InnerVolumeSpecName "kube-api-access-v6q6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.538481 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" (UID: "fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.596550 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.596577 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.596586 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.596597 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.596605 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.596613 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6q6v\" (UniqueName: 
\"kubernetes.io/projected/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-kube-api-access-v6q6v\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.596622 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.617793 4717 scope.go:117] "RemoveContainer" containerID="c4724f7dcdc72e5dac94a357887527f8daee7f38df85c48e6a901084055c22f9" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.698559 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-85b46995b-rj5bq" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.746660 4717 scope.go:117] "RemoveContainer" containerID="3838e90e0e40e4cbb19960a2004379528c4e2eaff1928a49b1253b50209ed9d3" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.778561 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b88fd5cc6-dqjmc"] Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.778843 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b88fd5cc6-dqjmc" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon-log" containerID="cri-o://d8519b0f4b147248ba4d6f91a993baf664f8ea8a630bc4159f522032d1ffea67" gracePeriod=30 Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.779445 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b88fd5cc6-dqjmc" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon" containerID="cri-o://de13a9e87b8b1b1396a6b56df5eb0db2cb23cd17ad0fd1fa2115022517f5d512" gracePeriod=30 Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.791051 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b88fd5cc6-dqjmc" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.805278 4717 scope.go:117] "RemoveContainer" containerID="5b321fba023192bfa1cfb97df9238f2d1cd260a5dfa1a7792d062cbe0d32c773" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.861574 4717 scope.go:117] "RemoveContainer" containerID="9b0e8901f70ef1717d2e1dc0e35c08ede9faad7674cd112478e5224ef0aef980" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.868645 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e706234-18f2-467c-8681-79402d25eb1c" path="/var/lib/kubelet/pods/4e706234-18f2-467c-8681-79402d25eb1c/volumes" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.869283 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="914980ae-b392-46a5-bf14-0a80569736f1" path="/var/lib/kubelet/pods/914980ae-b392-46a5-bf14-0a80569736f1/volumes" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.869847 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb4300fb-9575-47b9-89a3-ca587932319f" path="/var/lib/kubelet/pods/bb4300fb-9575-47b9-89a3-ca587932319f/volumes" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.871669 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5afdf52-2ad7-4957-8bff-ca95fee13432" path="/var/lib/kubelet/pods/d5afdf52-2ad7-4957-8bff-ca95fee13432/volumes" Feb 17 15:12:29 crc kubenswrapper[4717]: I0217 15:12:29.872368 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0360385-5ad6-4d9b-b825-1cae0c23d814" path="/var/lib/kubelet/pods/f0360385-5ad6-4d9b-b825-1cae0c23d814/volumes" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.346773 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"d8363289-fd40-402e-a82d-2d7954bdca28","Type":"ContainerStarted","Data":"89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718"} Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.350855 4717 generic.go:334] "Generic (PLEG): container finished" podID="8de54124-8e8d-4a21-b058-d1007d78d6ce" containerID="c72e1bdaaf913dfadcfd34cb4d164b89c9bb3a05e781ade19d1ca8d7ea547e00" exitCode=0 Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.350939 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fff66767-xkhmf" event={"ID":"8de54124-8e8d-4a21-b058-d1007d78d6ce","Type":"ContainerDied","Data":"c72e1bdaaf913dfadcfd34cb4d164b89c9bb3a05e781ade19d1ca8d7ea547e00"} Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.353185 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b696c957f-jwh8l" event={"ID":"d5fc5d05-07ba-476e-835e-61dfa2e9edc1","Type":"ContainerStarted","Data":"74c4ee49b2a12a816d1f57a0742cb53f9388630c39fa9171078af34e85a2d885"} Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.353230 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b696c957f-jwh8l" event={"ID":"d5fc5d05-07ba-476e-835e-61dfa2e9edc1","Type":"ContainerStarted","Data":"6f0abbea1bb8b46f1ad65c04babead78baed0b0d40c092baf5caf43f11a2f0e7"} Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.353295 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b696c957f-jwh8l" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.360772 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.360771 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1","Type":"ContainerDied","Data":"711b964646b2722c68c605a6aa40b482da7cad7bb4fb503552f432a7c9a2d6a2"} Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.360952 4717 scope.go:117] "RemoveContainer" containerID="04306c04a60664c6f4bcf509c853df0bfc861b7f61f0d79be121eacc4762382e" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.375852 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"518c6b94-614f-42fd-9016-122cdcfcb8c9","Type":"ContainerStarted","Data":"70263104b1927c7f0229427e01186df2978890ca5b3907f3b2f922df147c7e7e"} Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.412300 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b696c957f-jwh8l" podStartSLOduration=7.412281658 podStartE2EDuration="7.412281658s" podCreationTimestamp="2026-02-17 15:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:30.395919454 +0000 UTC m=+1216.811759940" watchObservedRunningTime="2026-02-17 15:12:30.412281658 +0000 UTC m=+1216.828122134" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.450027 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.470592 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498142 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:12:30 crc kubenswrapper[4717]: E0217 15:12:30.498576 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="914980ae-b392-46a5-bf14-0a80569736f1" containerName="barbican-worker-log" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498589 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="914980ae-b392-46a5-bf14-0a80569736f1" containerName="barbican-worker-log" Feb 17 15:12:30 crc kubenswrapper[4717]: E0217 15:12:30.498605 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5afdf52-2ad7-4957-8bff-ca95fee13432" containerName="dnsmasq-dns" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498611 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5afdf52-2ad7-4957-8bff-ca95fee13432" containerName="dnsmasq-dns" Feb 17 15:12:30 crc kubenswrapper[4717]: E0217 15:12:30.498617 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerName="barbican-api-log" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498623 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerName="barbican-api-log" Feb 17 15:12:30 crc kubenswrapper[4717]: E0217 15:12:30.498634 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" containerName="sg-core" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498641 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" containerName="sg-core" Feb 17 15:12:30 crc kubenswrapper[4717]: E0217 15:12:30.498652 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e706234-18f2-467c-8681-79402d25eb1c" containerName="neutron-httpd" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498658 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e706234-18f2-467c-8681-79402d25eb1c" containerName="neutron-httpd" Feb 17 15:12:30 crc kubenswrapper[4717]: E0217 15:12:30.498671 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d5afdf52-2ad7-4957-8bff-ca95fee13432" containerName="init" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498678 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5afdf52-2ad7-4957-8bff-ca95fee13432" containerName="init" Feb 17 15:12:30 crc kubenswrapper[4717]: E0217 15:12:30.498686 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4300fb-9575-47b9-89a3-ca587932319f" containerName="barbican-keystone-listener-log" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498693 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4300fb-9575-47b9-89a3-ca587932319f" containerName="barbican-keystone-listener-log" Feb 17 15:12:30 crc kubenswrapper[4717]: E0217 15:12:30.498700 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4300fb-9575-47b9-89a3-ca587932319f" containerName="barbican-keystone-listener" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498706 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4300fb-9575-47b9-89a3-ca587932319f" containerName="barbican-keystone-listener" Feb 17 15:12:30 crc kubenswrapper[4717]: E0217 15:12:30.498716 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="914980ae-b392-46a5-bf14-0a80569736f1" containerName="barbican-worker" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498721 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="914980ae-b392-46a5-bf14-0a80569736f1" containerName="barbican-worker" Feb 17 15:12:30 crc kubenswrapper[4717]: E0217 15:12:30.498738 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerName="barbican-api" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498744 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerName="barbican-api" Feb 17 15:12:30 crc kubenswrapper[4717]: E0217 15:12:30.498754 4717 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4e706234-18f2-467c-8681-79402d25eb1c" containerName="neutron-api" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498760 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e706234-18f2-467c-8681-79402d25eb1c" containerName="neutron-api" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498910 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e706234-18f2-467c-8681-79402d25eb1c" containerName="neutron-api" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498920 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerName="barbican-api-log" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498935 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4300fb-9575-47b9-89a3-ca587932319f" containerName="barbican-keystone-listener" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498947 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0360385-5ad6-4d9b-b825-1cae0c23d814" containerName="barbican-api" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498959 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="914980ae-b392-46a5-bf14-0a80569736f1" containerName="barbican-worker-log" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498969 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4300fb-9575-47b9-89a3-ca587932319f" containerName="barbican-keystone-listener-log" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498981 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e706234-18f2-467c-8681-79402d25eb1c" containerName="neutron-httpd" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.498990 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" containerName="sg-core" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.499002 4717 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d5afdf52-2ad7-4957-8bff-ca95fee13432" containerName="dnsmasq-dns" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.499012 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="914980ae-b392-46a5-bf14-0a80569736f1" containerName="barbican-worker" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.500615 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.507855 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.519029 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.519207 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.546168 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=112.661090428 podStartE2EDuration="2m14.546153734s" podCreationTimestamp="2026-02-17 15:10:16 +0000 UTC" firstStartedPulling="2026-02-17 15:11:54.908658879 +0000 UTC m=+1181.324499355" lastFinishedPulling="2026-02-17 15:12:16.793722185 +0000 UTC m=+1203.209562661" observedRunningTime="2026-02-17 15:12:30.518608653 +0000 UTC m=+1216.934449139" watchObservedRunningTime="2026-02-17 15:12:30.546153734 +0000 UTC m=+1216.961994210" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.631778 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc 
kubenswrapper[4717]: I0217 15:12:30.632134 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-scripts\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.632226 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ghtq\" (UniqueName: \"kubernetes.io/projected/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-kube-api-access-8ghtq\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.632504 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.632671 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-config-data\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.632696 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-log-httpd\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.632754 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-run-httpd\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.734252 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-config-data\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.734292 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-log-httpd\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.734327 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-run-httpd\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.734348 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.734412 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-scripts\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 
15:12:30.734457 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ghtq\" (UniqueName: \"kubernetes.io/projected/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-kube-api-access-8ghtq\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.734498 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.734905 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-run-httpd\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.735542 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-log-httpd\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.740132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.740388 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-scripts\") pod \"ceilometer-0\" (UID: 
\"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.740889 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.762033 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-config-data\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.766031 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ghtq\" (UniqueName: \"kubernetes.io/projected/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-kube-api-access-8ghtq\") pod \"ceilometer-0\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.805705 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57fff66767-xkhmf"] Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.836508 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-mxm6x"] Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.837935 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.839929 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.862813 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-mxm6x"] Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.871820 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.939062 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.939322 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfhxm\" (UniqueName: \"kubernetes.io/projected/834aebac-9e50-4e80-868c-231373fa208a-kube-api-access-bfhxm\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.939405 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.939517 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-config\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.939646 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-svc\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" Feb 17 15:12:30 crc kubenswrapper[4717]: I0217 15:12:30.939728 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.042505 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-config\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.042965 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-svc\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.043034 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.043125 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.043177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfhxm\" (UniqueName: \"kubernetes.io/projected/834aebac-9e50-4e80-868c-231373fa208a-kube-api-access-bfhxm\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.043269 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.043680 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-config\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.044315 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.044790 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.044960 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-svc\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.048256 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.063955 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfhxm\" (UniqueName: \"kubernetes.io/projected/834aebac-9e50-4e80-868c-231373fa208a-kube-api-access-bfhxm\") pod \"dnsmasq-dns-5784cf869f-mxm6x\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.177830 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.389768 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8363289-fd40-402e-a82d-2d7954bdca28","Type":"ContainerStarted","Data":"e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829"}
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.389999 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d8363289-fd40-402e-a82d-2d7954bdca28" containerName="cinder-api-log" containerID="cri-o://89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718" gracePeriod=30
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.390309 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.390669 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d8363289-fd40-402e-a82d-2d7954bdca28" containerName="cinder-api" containerID="cri-o://e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829" gracePeriod=30
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.405260 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57fff66767-xkhmf" podUID="8de54124-8e8d-4a21-b058-d1007d78d6ce" containerName="dnsmasq-dns" containerID="cri-o://0624a1914ce737529b0e20cd4cbb81860c5f30a90d19ad2aa74a695494c18639" gracePeriod=10
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.405881 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fff66767-xkhmf" event={"ID":"8de54124-8e8d-4a21-b058-d1007d78d6ce","Type":"ContainerStarted","Data":"0624a1914ce737529b0e20cd4cbb81860c5f30a90d19ad2aa74a695494c18639"}
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.406019 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57fff66767-xkhmf"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.445931 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.445910595 podStartE2EDuration="6.445910595s" podCreationTimestamp="2026-02-17 15:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:31.428559383 +0000 UTC m=+1217.844399889" watchObservedRunningTime="2026-02-17 15:12:31.445910595 +0000 UTC m=+1217.861751071"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.474346 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57fff66767-xkhmf" podStartSLOduration=7.474329841 podStartE2EDuration="7.474329841s" podCreationTimestamp="2026-02-17 15:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:31.47041795 +0000 UTC m=+1217.886258436" watchObservedRunningTime="2026-02-17 15:12:31.474329841 +0000 UTC m=+1217.890170317"
Feb 17 15:12:31 crc kubenswrapper[4717]: I0217 15:12:31.944784 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1" path="/var/lib/kubelet/pods/fcc73c8c-2cbb-4bd2-bd45-c05a65865cf1/volumes"
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.039582 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.201253 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-mxm6x"]
Feb 17 15:12:32 crc kubenswrapper[4717]: W0217 15:12:32.207637 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod834aebac_9e50_4e80_868c_231373fa208a.slice/crio-4e1a95c59e5e2e34b2e19b3ee2ed9ed07a64526da44b3ae78fed23154096465a WatchSource:0}: Error finding container 4e1a95c59e5e2e34b2e19b3ee2ed9ed07a64526da44b3ae78fed23154096465a: Status 404 returned error can't find the container with id 4e1a95c59e5e2e34b2e19b3ee2ed9ed07a64526da44b3ae78fed23154096465a
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.432523 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4","Type":"ContainerStarted","Data":"1a9a67b818009381d2a4826767920304abb28ff0ccd71d34f3c3330eaaeefa0c"}
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.441601 4717 generic.go:334] "Generic (PLEG): container finished" podID="8de54124-8e8d-4a21-b058-d1007d78d6ce" containerID="0624a1914ce737529b0e20cd4cbb81860c5f30a90d19ad2aa74a695494c18639" exitCode=0
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.441699 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fff66767-xkhmf" event={"ID":"8de54124-8e8d-4a21-b058-d1007d78d6ce","Type":"ContainerDied","Data":"0624a1914ce737529b0e20cd4cbb81860c5f30a90d19ad2aa74a695494c18639"}
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.448445 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" event={"ID":"834aebac-9e50-4e80-868c-231373fa208a","Type":"ContainerStarted","Data":"4e1a95c59e5e2e34b2e19b3ee2ed9ed07a64526da44b3ae78fed23154096465a"}
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.454735 4717 generic.go:334] "Generic (PLEG): container finished" podID="d8363289-fd40-402e-a82d-2d7954bdca28" containerID="89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718" exitCode=143
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.454779 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8363289-fd40-402e-a82d-2d7954bdca28","Type":"ContainerDied","Data":"89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718"}
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.536266 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57fff66767-xkhmf"
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.729196 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-config\") pod \"8de54124-8e8d-4a21-b058-d1007d78d6ce\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") "
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.729323 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-nb\") pod \"8de54124-8e8d-4a21-b058-d1007d78d6ce\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") "
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.729356 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrtl8\" (UniqueName: \"kubernetes.io/projected/8de54124-8e8d-4a21-b058-d1007d78d6ce-kube-api-access-jrtl8\") pod \"8de54124-8e8d-4a21-b058-d1007d78d6ce\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") "
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.729406 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-dns-svc\") pod \"8de54124-8e8d-4a21-b058-d1007d78d6ce\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") "
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.729438 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-sb\") pod \"8de54124-8e8d-4a21-b058-d1007d78d6ce\" (UID: \"8de54124-8e8d-4a21-b058-d1007d78d6ce\") "
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.756356 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de54124-8e8d-4a21-b058-d1007d78d6ce-kube-api-access-jrtl8" (OuterVolumeSpecName: "kube-api-access-jrtl8") pod "8de54124-8e8d-4a21-b058-d1007d78d6ce" (UID: "8de54124-8e8d-4a21-b058-d1007d78d6ce"). InnerVolumeSpecName "kube-api-access-jrtl8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.810325 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8de54124-8e8d-4a21-b058-d1007d78d6ce" (UID: "8de54124-8e8d-4a21-b058-d1007d78d6ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.815554 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8de54124-8e8d-4a21-b058-d1007d78d6ce" (UID: "8de54124-8e8d-4a21-b058-d1007d78d6ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.832268 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.832300 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrtl8\" (UniqueName: \"kubernetes.io/projected/8de54124-8e8d-4a21-b058-d1007d78d6ce-kube-api-access-jrtl8\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.832311 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.838003 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-config" (OuterVolumeSpecName: "config") pod "8de54124-8e8d-4a21-b058-d1007d78d6ce" (UID: "8de54124-8e8d-4a21-b058-d1007d78d6ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.856866 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8de54124-8e8d-4a21-b058-d1007d78d6ce" (UID: "8de54124-8e8d-4a21-b058-d1007d78d6ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.935068 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:32 crc kubenswrapper[4717]: I0217 15:12:32.935136 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8de54124-8e8d-4a21-b058-d1007d78d6ce-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:33 crc kubenswrapper[4717]: I0217 15:12:33.199915 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b88fd5cc6-dqjmc" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:33374->10.217.0.143:8443: read: connection reset by peer"
Feb 17 15:12:33 crc kubenswrapper[4717]: I0217 15:12:33.482870 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fff66767-xkhmf" event={"ID":"8de54124-8e8d-4a21-b058-d1007d78d6ce","Type":"ContainerDied","Data":"f990e3fb6ce569ba03954d617fc257aaf61fbb5f5aa31da685e4efc5b63171af"}
Feb 17 15:12:33 crc kubenswrapper[4717]: I0217 15:12:33.482923 4717 scope.go:117] "RemoveContainer" containerID="0624a1914ce737529b0e20cd4cbb81860c5f30a90d19ad2aa74a695494c18639"
Feb 17 15:12:33 crc kubenswrapper[4717]: I0217 15:12:33.482953 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57fff66767-xkhmf"
Feb 17 15:12:33 crc kubenswrapper[4717]: I0217 15:12:33.520986 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57fff66767-xkhmf"]
Feb 17 15:12:33 crc kubenswrapper[4717]: I0217 15:12:33.530898 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57fff66767-xkhmf"]
Feb 17 15:12:33 crc kubenswrapper[4717]: I0217 15:12:33.573335 4717 scope.go:117] "RemoveContainer" containerID="c72e1bdaaf913dfadcfd34cb4d164b89c9bb3a05e781ade19d1ca8d7ea547e00"
Feb 17 15:12:33 crc kubenswrapper[4717]: I0217 15:12:33.861470 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de54124-8e8d-4a21-b058-d1007d78d6ce" path="/var/lib/kubelet/pods/8de54124-8e8d-4a21-b058-d1007d78d6ce/volumes"
Feb 17 15:12:34 crc kubenswrapper[4717]: I0217 15:12:34.496183 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" event={"ID":"834aebac-9e50-4e80-868c-231373fa208a","Type":"ContainerStarted","Data":"c0ea11c31a144ba5f4b8c9c3ef241b602ae527ce24eb44b430fd30cc4ca68c6f"}
Feb 17 15:12:34 crc kubenswrapper[4717]: I0217 15:12:34.497653 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55a43ce8-bd32-4179-a265-04112693da30","Type":"ContainerStarted","Data":"9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867"}
Feb 17 15:12:34 crc kubenswrapper[4717]: I0217 15:12:34.499914 4717 generic.go:334] "Generic (PLEG): container finished" podID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerID="de13a9e87b8b1b1396a6b56df5eb0db2cb23cd17ad0fd1fa2115022517f5d512" exitCode=0
Feb 17 15:12:34 crc kubenswrapper[4717]: I0217 15:12:34.499964 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b88fd5cc6-dqjmc" event={"ID":"946c8c31-01d1-45f7-87c2-a022100aeef9","Type":"ContainerDied","Data":"de13a9e87b8b1b1396a6b56df5eb0db2cb23cd17ad0fd1fa2115022517f5d512"}
Feb 17 15:12:35 crc kubenswrapper[4717]: I0217 15:12:35.016226 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b88fd5cc6-dqjmc" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused"
Feb 17 15:12:35 crc kubenswrapper[4717]: I0217 15:12:35.511792 4717 generic.go:334] "Generic (PLEG): container finished" podID="834aebac-9e50-4e80-868c-231373fa208a" containerID="c0ea11c31a144ba5f4b8c9c3ef241b602ae527ce24eb44b430fd30cc4ca68c6f" exitCode=0
Feb 17 15:12:35 crc kubenswrapper[4717]: I0217 15:12:35.511843 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" event={"ID":"834aebac-9e50-4e80-868c-231373fa208a","Type":"ContainerDied","Data":"c0ea11c31a144ba5f4b8c9c3ef241b602ae527ce24eb44b430fd30cc4ca68c6f"}
Feb 17 15:12:36 crc kubenswrapper[4717]: I0217 15:12:36.528851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" event={"ID":"834aebac-9e50-4e80-868c-231373fa208a","Type":"ContainerStarted","Data":"38261e17bd91695eea2b84a6bd26372cd543d893ec8226ac6dece833d75d680f"}
Feb 17 15:12:36 crc kubenswrapper[4717]: I0217 15:12:36.529786 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:36 crc kubenswrapper[4717]: I0217 15:12:36.533073 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55a43ce8-bd32-4179-a265-04112693da30","Type":"ContainerStarted","Data":"2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31"}
Feb 17 15:12:36 crc kubenswrapper[4717]: I0217 15:12:36.557886 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" podStartSLOduration=6.557871026 podStartE2EDuration="6.557871026s" podCreationTimestamp="2026-02-17 15:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:36.554741168 +0000 UTC m=+1222.970581654" watchObservedRunningTime="2026-02-17 15:12:36.557871026 +0000 UTC m=+1222.973711502"
Feb 17 15:12:36 crc kubenswrapper[4717]: I0217 15:12:36.585832 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=9.847239261 podStartE2EDuration="12.585816599s" podCreationTimestamp="2026-02-17 15:12:24 +0000 UTC" firstStartedPulling="2026-02-17 15:12:28.72930961 +0000 UTC m=+1215.145150086" lastFinishedPulling="2026-02-17 15:12:31.467886948 +0000 UTC m=+1217.883727424" observedRunningTime="2026-02-17 15:12:36.575500206 +0000 UTC m=+1222.991340692" watchObservedRunningTime="2026-02-17 15:12:36.585816599 +0000 UTC m=+1223.001657075"
Feb 17 15:12:39 crc kubenswrapper[4717]: I0217 15:12:39.837544 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.077139 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-567bd59758-rrcxt"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.122914 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.483063 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.488801 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-96f9cc575-vd9jv"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.579480 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4","Type":"ContainerStarted","Data":"0ffa36f3aa17567b5726cc123f55bc87e4712b60fc91f1e4e9e672f4381f5a4c"}
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.626569 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.703071 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 17 15:12:40 crc kubenswrapper[4717]: E0217 15:12:40.703435 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de54124-8e8d-4a21-b058-d1007d78d6ce" containerName="init"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.703455 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de54124-8e8d-4a21-b058-d1007d78d6ce" containerName="init"
Feb 17 15:12:40 crc kubenswrapper[4717]: E0217 15:12:40.703480 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de54124-8e8d-4a21-b058-d1007d78d6ce" containerName="dnsmasq-dns"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.703491 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de54124-8e8d-4a21-b058-d1007d78d6ce" containerName="dnsmasq-dns"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.703739 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de54124-8e8d-4a21-b058-d1007d78d6ce" containerName="dnsmasq-dns"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.704365 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.710509 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-2xndk"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.711130 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.711245 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.719683 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.806350 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8cf42885-3509-4779-901c-e88f11c5fdfd-openstack-config-secret\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.806782 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8cf42885-3509-4779-901c-e88f11c5fdfd-openstack-config\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.806848 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5644\" (UniqueName: \"kubernetes.io/projected/8cf42885-3509-4779-901c-e88f11c5fdfd-kube-api-access-l5644\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.806877 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf42885-3509-4779-901c-e88f11c5fdfd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.910186 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8cf42885-3509-4779-901c-e88f11c5fdfd-openstack-config\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.910273 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5644\" (UniqueName: \"kubernetes.io/projected/8cf42885-3509-4779-901c-e88f11c5fdfd-kube-api-access-l5644\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.910308 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf42885-3509-4779-901c-e88f11c5fdfd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.912566 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8cf42885-3509-4779-901c-e88f11c5fdfd-openstack-config\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.912874 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8cf42885-3509-4779-901c-e88f11c5fdfd-openstack-config-secret\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.916654 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8cf42885-3509-4779-901c-e88f11c5fdfd-openstack-config-secret\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.917007 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf42885-3509-4779-901c-e88f11c5fdfd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:40 crc kubenswrapper[4717]: I0217 15:12:40.932738 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5644\" (UniqueName: \"kubernetes.io/projected/8cf42885-3509-4779-901c-e88f11c5fdfd-kube-api-access-l5644\") pod \"openstackclient\" (UID: \"8cf42885-3509-4779-901c-e88f11c5fdfd\") " pod="openstack/openstackclient"
Feb 17 15:12:41 crc kubenswrapper[4717]: I0217 15:12:41.031846 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 15:12:41 crc kubenswrapper[4717]: I0217 15:12:41.179324 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x"
Feb 17 15:12:41 crc kubenswrapper[4717]: I0217 15:12:41.259311 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-fw7hr"]
Feb 17 15:12:41 crc kubenswrapper[4717]: I0217 15:12:41.259653 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" podUID="46311c1e-077b-4089-8a6e-c8497cd5b796" containerName="dnsmasq-dns" containerID="cri-o://2ad436fd972abed4c75eb2082b7da4008e624a7e59edbb26b6682c4677f1820e" gracePeriod=10
Feb 17 15:12:41 crc kubenswrapper[4717]: I0217 15:12:41.593119 4717 generic.go:334] "Generic (PLEG): container finished" podID="46311c1e-077b-4089-8a6e-c8497cd5b796" containerID="2ad436fd972abed4c75eb2082b7da4008e624a7e59edbb26b6682c4677f1820e" exitCode=0
Feb 17 15:12:41 crc kubenswrapper[4717]: I0217 15:12:41.593200 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" event={"ID":"46311c1e-077b-4089-8a6e-c8497cd5b796","Type":"ContainerDied","Data":"2ad436fd972abed4c75eb2082b7da4008e624a7e59edbb26b6682c4677f1820e"}
Feb 17 15:12:41 crc kubenswrapper[4717]: I0217 15:12:41.593349 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="55a43ce8-bd32-4179-a265-04112693da30" containerName="cinder-scheduler" containerID="cri-o://9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867" gracePeriod=30
Feb 17 15:12:41 crc kubenswrapper[4717]: I0217 15:12:41.593411 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="55a43ce8-bd32-4179-a265-04112693da30" containerName="probe" containerID="cri-o://2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31" gracePeriod=30
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.193302 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr"
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.248430 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-dns-svc\") pod \"46311c1e-077b-4089-8a6e-c8497cd5b796\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") "
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.248594 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-nb\") pod \"46311c1e-077b-4089-8a6e-c8497cd5b796\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") "
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.248660 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-sb\") pod \"46311c1e-077b-4089-8a6e-c8497cd5b796\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") "
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.248757 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99fjn\" (UniqueName: \"kubernetes.io/projected/46311c1e-077b-4089-8a6e-c8497cd5b796-kube-api-access-99fjn\") pod \"46311c1e-077b-4089-8a6e-c8497cd5b796\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") "
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.248858 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-config\") pod \"46311c1e-077b-4089-8a6e-c8497cd5b796\" (UID: \"46311c1e-077b-4089-8a6e-c8497cd5b796\") "
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.251317 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.267666 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46311c1e-077b-4089-8a6e-c8497cd5b796-kube-api-access-99fjn" (OuterVolumeSpecName: "kube-api-access-99fjn") pod "46311c1e-077b-4089-8a6e-c8497cd5b796" (UID: "46311c1e-077b-4089-8a6e-c8497cd5b796"). InnerVolumeSpecName "kube-api-access-99fjn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:12:42 crc kubenswrapper[4717]: W0217 15:12:42.285897 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cf42885_3509_4779_901c_e88f11c5fdfd.slice/crio-4b5dd0c2dad23c83fdbeea574eed5bd8bdcb97cea7a42879a8e3c59e7d54ab77 WatchSource:0}: Error finding container 4b5dd0c2dad23c83fdbeea574eed5bd8bdcb97cea7a42879a8e3c59e7d54ab77: Status 404 returned error can't find the container with id 4b5dd0c2dad23c83fdbeea574eed5bd8bdcb97cea7a42879a8e3c59e7d54ab77
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.327323 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46311c1e-077b-4089-8a6e-c8497cd5b796" (UID: "46311c1e-077b-4089-8a6e-c8497cd5b796"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.350283 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-config" (OuterVolumeSpecName: "config") pod "46311c1e-077b-4089-8a6e-c8497cd5b796" (UID: "46311c1e-077b-4089-8a6e-c8497cd5b796"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.351552 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99fjn\" (UniqueName: \"kubernetes.io/projected/46311c1e-077b-4089-8a6e-c8497cd5b796-kube-api-access-99fjn\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.351570 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.351579 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.360128 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46311c1e-077b-4089-8a6e-c8497cd5b796" (UID: "46311c1e-077b-4089-8a6e-c8497cd5b796"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.363129 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46311c1e-077b-4089-8a6e-c8497cd5b796" (UID: "46311c1e-077b-4089-8a6e-c8497cd5b796"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.453339 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.453582 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46311c1e-077b-4089-8a6e-c8497cd5b796-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.603610 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr"
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.603618 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-fw7hr" event={"ID":"46311c1e-077b-4089-8a6e-c8497cd5b796","Type":"ContainerDied","Data":"e3849f4fbbee51e34126e46baadd483453d9fe79d64beeaccdb2f94b9fd351fe"}
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.603943 4717 scope.go:117] "RemoveContainer" containerID="2ad436fd972abed4c75eb2082b7da4008e624a7e59edbb26b6682c4677f1820e"
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.606449 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8cf42885-3509-4779-901c-e88f11c5fdfd","Type":"ContainerStarted","Data":"4b5dd0c2dad23c83fdbeea574eed5bd8bdcb97cea7a42879a8e3c59e7d54ab77"}
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.611203 4717 generic.go:334] "Generic (PLEG): container finished" podID="55a43ce8-bd32-4179-a265-04112693da30" containerID="2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31" exitCode=0
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.611284 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55a43ce8-bd32-4179-a265-04112693da30","Type":"ContainerDied","Data":"2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31"}
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.621126 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4","Type":"ContainerStarted","Data":"aa3b8b5ff9855549d1302342083239260b7758d2e94d94537bfac3ec3d903725"}
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.626008 4717 scope.go:117] "RemoveContainer" containerID="2099de09ac93e60fa3f259ec16f0d3700dad6773496ada71ff46ad3d902b4e30"
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.645133 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-fw7hr"]
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.661573 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-fw7hr"]
Feb 17 15:12:42 crc kubenswrapper[4717]: I0217 15:12:42.976560 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.384804 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7fb4dbdc46-npdpn"
Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.493966 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7fb4dbdc46-npdpn"
Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.580207 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-567bd59758-rrcxt"]
Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.580431 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-567bd59758-rrcxt" podUID="932dfe25-b5f6-4cc6-92f7-6594c8449263" containerName="placement-log" containerID="cri-o://09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1"
gracePeriod=30 Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.580637 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-567bd59758-rrcxt" podUID="932dfe25-b5f6-4cc6-92f7-6594c8449263" containerName="placement-api" containerID="cri-o://2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0" gracePeriod=30 Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.628237 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.659581 4717 generic.go:334] "Generic (PLEG): container finished" podID="55a43ce8-bd32-4179-a265-04112693da30" containerID="9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867" exitCode=0 Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.659682 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.659697 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55a43ce8-bd32-4179-a265-04112693da30","Type":"ContainerDied","Data":"9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867"} Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.659732 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"55a43ce8-bd32-4179-a265-04112693da30","Type":"ContainerDied","Data":"08bf4714804b49c3a250c6f4f718f38d549237cfc99347c89349843e102a60d4"} Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.659754 4717 scope.go:117] "RemoveContainer" containerID="2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.686391 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4","Type":"ContainerStarted","Data":"2ac53413e7c6847ab1251601cb05d12b1ad759d3861c77073c818853c34f9522"} Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.690255 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-scripts\") pod \"55a43ce8-bd32-4179-a265-04112693da30\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.690294 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data-custom\") pod \"55a43ce8-bd32-4179-a265-04112693da30\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.690340 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-combined-ca-bundle\") pod \"55a43ce8-bd32-4179-a265-04112693da30\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.690449 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55a43ce8-bd32-4179-a265-04112693da30-etc-machine-id\") pod \"55a43ce8-bd32-4179-a265-04112693da30\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.690520 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data\") pod \"55a43ce8-bd32-4179-a265-04112693da30\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.690586 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5l6v6\" (UniqueName: \"kubernetes.io/projected/55a43ce8-bd32-4179-a265-04112693da30-kube-api-access-5l6v6\") pod \"55a43ce8-bd32-4179-a265-04112693da30\" (UID: \"55a43ce8-bd32-4179-a265-04112693da30\") " Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.694222 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55a43ce8-bd32-4179-a265-04112693da30-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "55a43ce8-bd32-4179-a265-04112693da30" (UID: "55a43ce8-bd32-4179-a265-04112693da30"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.700036 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a43ce8-bd32-4179-a265-04112693da30-kube-api-access-5l6v6" (OuterVolumeSpecName: "kube-api-access-5l6v6") pod "55a43ce8-bd32-4179-a265-04112693da30" (UID: "55a43ce8-bd32-4179-a265-04112693da30"). InnerVolumeSpecName "kube-api-access-5l6v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.700595 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "55a43ce8-bd32-4179-a265-04112693da30" (UID: "55a43ce8-bd32-4179-a265-04112693da30"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.703351 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-scripts" (OuterVolumeSpecName: "scripts") pod "55a43ce8-bd32-4179-a265-04112693da30" (UID: "55a43ce8-bd32-4179-a265-04112693da30"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.726588 4717 scope.go:117] "RemoveContainer" containerID="9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.753608 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55a43ce8-bd32-4179-a265-04112693da30" (UID: "55a43ce8-bd32-4179-a265-04112693da30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.754673 4717 scope.go:117] "RemoveContainer" containerID="2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31" Feb 17 15:12:43 crc kubenswrapper[4717]: E0217 15:12:43.755206 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31\": container with ID starting with 2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31 not found: ID does not exist" containerID="2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.755245 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31"} err="failed to get container status \"2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31\": rpc error: code = NotFound desc = could not find container \"2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31\": container with ID starting with 2e1e6eb45d231a1239ae4a320276d708987a5d44fea349db3b5b0fc169414a31 not found: ID does not exist" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.755271 4717 scope.go:117] 
"RemoveContainer" containerID="9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867" Feb 17 15:12:43 crc kubenswrapper[4717]: E0217 15:12:43.756289 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867\": container with ID starting with 9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867 not found: ID does not exist" containerID="9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.756342 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867"} err="failed to get container status \"9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867\": rpc error: code = NotFound desc = could not find container \"9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867\": container with ID starting with 9efcb1eb9c377275df54f0399fc5146581077492067be3fc486f91b2ba0ce867 not found: ID does not exist" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.792005 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.792036 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.792046 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 
15:12:43.792054 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55a43ce8-bd32-4179-a265-04112693da30-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.792063 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l6v6\" (UniqueName: \"kubernetes.io/projected/55a43ce8-bd32-4179-a265-04112693da30-kube-api-access-5l6v6\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.816290 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data" (OuterVolumeSpecName: "config-data") pod "55a43ce8-bd32-4179-a265-04112693da30" (UID: "55a43ce8-bd32-4179-a265-04112693da30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.859963 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46311c1e-077b-4089-8a6e-c8497cd5b796" path="/var/lib/kubelet/pods/46311c1e-077b-4089-8a6e-c8497cd5b796/volumes" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.893639 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a43ce8-bd32-4179-a265-04112693da30-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.986289 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 15:12:43 crc kubenswrapper[4717]: I0217 15:12:43.999396 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.011192 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 15:12:44 crc kubenswrapper[4717]: E0217 15:12:44.011652 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="46311c1e-077b-4089-8a6e-c8497cd5b796" containerName="init" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.011667 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="46311c1e-077b-4089-8a6e-c8497cd5b796" containerName="init" Feb 17 15:12:44 crc kubenswrapper[4717]: E0217 15:12:44.011692 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46311c1e-077b-4089-8a6e-c8497cd5b796" containerName="dnsmasq-dns" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.011699 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="46311c1e-077b-4089-8a6e-c8497cd5b796" containerName="dnsmasq-dns" Feb 17 15:12:44 crc kubenswrapper[4717]: E0217 15:12:44.011709 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a43ce8-bd32-4179-a265-04112693da30" containerName="cinder-scheduler" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.011714 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a43ce8-bd32-4179-a265-04112693da30" containerName="cinder-scheduler" Feb 17 15:12:44 crc kubenswrapper[4717]: E0217 15:12:44.011732 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a43ce8-bd32-4179-a265-04112693da30" containerName="probe" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.011738 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a43ce8-bd32-4179-a265-04112693da30" containerName="probe" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.011893 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a43ce8-bd32-4179-a265-04112693da30" containerName="probe" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.011918 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="46311c1e-077b-4089-8a6e-c8497cd5b796" containerName="dnsmasq-dns" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.011938 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="55a43ce8-bd32-4179-a265-04112693da30" containerName="cinder-scheduler" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.013132 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.021621 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.032826 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.200261 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.200316 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7eef2c95-16ed-4b57-a95a-aa5b302ec564-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.200418 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.200462 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-scripts\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.200594 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-config-data\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.200712 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gw62\" (UniqueName: \"kubernetes.io/projected/7eef2c95-16ed-4b57-a95a-aa5b302ec564-kube-api-access-6gw62\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.301981 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.302030 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-scripts\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.302114 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.302162 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gw62\" (UniqueName: \"kubernetes.io/projected/7eef2c95-16ed-4b57-a95a-aa5b302ec564-kube-api-access-6gw62\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.302203 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.302220 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7eef2c95-16ed-4b57-a95a-aa5b302ec564-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.302310 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7eef2c95-16ed-4b57-a95a-aa5b302ec564-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.307782 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-config-data\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.308344 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.308517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-scripts\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.316974 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7eef2c95-16ed-4b57-a95a-aa5b302ec564-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.324635 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gw62\" (UniqueName: \"kubernetes.io/projected/7eef2c95-16ed-4b57-a95a-aa5b302ec564-kube-api-access-6gw62\") pod \"cinder-scheduler-0\" (UID: \"7eef2c95-16ed-4b57-a95a-aa5b302ec564\") " pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.407468 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.756301 4717 generic.go:334] "Generic (PLEG): container finished" podID="932dfe25-b5f6-4cc6-92f7-6594c8449263" containerID="09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1" exitCode=143 Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.756566 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-567bd59758-rrcxt" event={"ID":"932dfe25-b5f6-4cc6-92f7-6594c8449263","Type":"ContainerDied","Data":"09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1"} Feb 17 15:12:44 crc kubenswrapper[4717]: I0217 15:12:44.905623 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 15:12:44 crc kubenswrapper[4717]: W0217 15:12:44.909802 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eef2c95_16ed_4b57_a95a_aa5b302ec564.slice/crio-5ed7737d725954307c3749e1d67d9bec4cb5d2a2b172d2c5e7125a9ea9c30eb5 WatchSource:0}: Error finding container 5ed7737d725954307c3749e1d67d9bec4cb5d2a2b172d2c5e7125a9ea9c30eb5: Status 404 returned error can't find the container with id 5ed7737d725954307c3749e1d67d9bec4cb5d2a2b172d2c5e7125a9ea9c30eb5 Feb 17 15:12:45 crc kubenswrapper[4717]: I0217 15:12:45.015831 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b88fd5cc6-dqjmc" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Feb 17 15:12:45 crc kubenswrapper[4717]: I0217 15:12:45.779720 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"7eef2c95-16ed-4b57-a95a-aa5b302ec564","Type":"ContainerStarted","Data":"909920d7f4a99e9d23ff9d0c87df7bf87b08f5f24e1f7f2970f65d607af02fa7"} Feb 17 15:12:45 crc kubenswrapper[4717]: I0217 15:12:45.780788 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7eef2c95-16ed-4b57-a95a-aa5b302ec564","Type":"ContainerStarted","Data":"5ed7737d725954307c3749e1d67d9bec4cb5d2a2b172d2c5e7125a9ea9c30eb5"} Feb 17 15:12:45 crc kubenswrapper[4717]: I0217 15:12:45.862034 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a43ce8-bd32-4179-a265-04112693da30" path="/var/lib/kubelet/pods/55a43ce8-bd32-4179-a265-04112693da30/volumes" Feb 17 15:12:46 crc kubenswrapper[4717]: I0217 15:12:46.606970 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:12:46 crc kubenswrapper[4717]: I0217 15:12:46.793929 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4","Type":"ContainerStarted","Data":"a0b65f0f7b866f77822127cb3df60c9c782c107ca199b4c83b3b3476d73b31c0"} Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.355920 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-567bd59758-rrcxt" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.376958 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-combined-ca-bundle\") pod \"932dfe25-b5f6-4cc6-92f7-6594c8449263\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.377099 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-config-data\") pod \"932dfe25-b5f6-4cc6-92f7-6594c8449263\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.377136 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99kv4\" (UniqueName: \"kubernetes.io/projected/932dfe25-b5f6-4cc6-92f7-6594c8449263-kube-api-access-99kv4\") pod \"932dfe25-b5f6-4cc6-92f7-6594c8449263\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.377178 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-scripts\") pod \"932dfe25-b5f6-4cc6-92f7-6594c8449263\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.377211 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-public-tls-certs\") pod \"932dfe25-b5f6-4cc6-92f7-6594c8449263\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.377250 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-internal-tls-certs\") pod \"932dfe25-b5f6-4cc6-92f7-6594c8449263\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.377288 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/932dfe25-b5f6-4cc6-92f7-6594c8449263-logs\") pod \"932dfe25-b5f6-4cc6-92f7-6594c8449263\" (UID: \"932dfe25-b5f6-4cc6-92f7-6594c8449263\") " Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.398340 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/932dfe25-b5f6-4cc6-92f7-6594c8449263-logs" (OuterVolumeSpecName: "logs") pod "932dfe25-b5f6-4cc6-92f7-6594c8449263" (UID: "932dfe25-b5f6-4cc6-92f7-6594c8449263"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.432437 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/932dfe25-b5f6-4cc6-92f7-6594c8449263-kube-api-access-99kv4" (OuterVolumeSpecName: "kube-api-access-99kv4") pod "932dfe25-b5f6-4cc6-92f7-6594c8449263" (UID: "932dfe25-b5f6-4cc6-92f7-6594c8449263"). InnerVolumeSpecName "kube-api-access-99kv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.474133 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-scripts" (OuterVolumeSpecName: "scripts") pod "932dfe25-b5f6-4cc6-92f7-6594c8449263" (UID: "932dfe25-b5f6-4cc6-92f7-6594c8449263"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.483646 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99kv4\" (UniqueName: \"kubernetes.io/projected/932dfe25-b5f6-4cc6-92f7-6594c8449263-kube-api-access-99kv4\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.483674 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.483687 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/932dfe25-b5f6-4cc6-92f7-6594c8449263-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.527207 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-config-data" (OuterVolumeSpecName: "config-data") pod "932dfe25-b5f6-4cc6-92f7-6594c8449263" (UID: "932dfe25-b5f6-4cc6-92f7-6594c8449263"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.527235 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "932dfe25-b5f6-4cc6-92f7-6594c8449263" (UID: "932dfe25-b5f6-4cc6-92f7-6594c8449263"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.533031 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "932dfe25-b5f6-4cc6-92f7-6594c8449263" (UID: "932dfe25-b5f6-4cc6-92f7-6594c8449263"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.585337 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.585398 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.585413 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.638885 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "932dfe25-b5f6-4cc6-92f7-6594c8449263" (UID: "932dfe25-b5f6-4cc6-92f7-6594c8449263"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.687168 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/932dfe25-b5f6-4cc6-92f7-6594c8449263-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.819904 4717 generic.go:334] "Generic (PLEG): container finished" podID="932dfe25-b5f6-4cc6-92f7-6594c8449263" containerID="2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0" exitCode=0 Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.819950 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-567bd59758-rrcxt" event={"ID":"932dfe25-b5f6-4cc6-92f7-6594c8449263","Type":"ContainerDied","Data":"2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0"} Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.819994 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-567bd59758-rrcxt" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.820030 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-567bd59758-rrcxt" event={"ID":"932dfe25-b5f6-4cc6-92f7-6594c8449263","Type":"ContainerDied","Data":"6df84643c8d7bebb2c7877075b5b1812acd717d08ec60c1df1651f1d8d0ee248"} Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.820055 4717 scope.go:117] "RemoveContainer" containerID="2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.824974 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7eef2c95-16ed-4b57-a95a-aa5b302ec564","Type":"ContainerStarted","Data":"ad5c5d5339b47eb9602d0f1b7dbc4546c61fd19f385c4b817b73baa2b727fc8b"} Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.825485 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="ceilometer-central-agent" containerID="cri-o://0ffa36f3aa17567b5726cc123f55bc87e4712b60fc91f1e4e9e672f4381f5a4c" gracePeriod=30 Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.825611 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="proxy-httpd" containerID="cri-o://a0b65f0f7b866f77822127cb3df60c9c782c107ca199b4c83b3b3476d73b31c0" gracePeriod=30 Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.825670 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="sg-core" containerID="cri-o://2ac53413e7c6847ab1251601cb05d12b1ad759d3861c77073c818853c34f9522" gracePeriod=30 Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.825719 4717 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="ceilometer-notification-agent" containerID="cri-o://aa3b8b5ff9855549d1302342083239260b7758d2e94d94537bfac3ec3d903725" gracePeriod=30 Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.863598 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.863574405 podStartE2EDuration="4.863574405s" podCreationTimestamp="2026-02-17 15:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:47.842573259 +0000 UTC m=+1234.258413735" watchObservedRunningTime="2026-02-17 15:12:47.863574405 +0000 UTC m=+1234.279414881" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.867425 4717 scope.go:117] "RemoveContainer" containerID="09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.883836 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-567bd59758-rrcxt"] Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.896470 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-567bd59758-rrcxt"] Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.896651 4717 scope.go:117] "RemoveContainer" containerID="2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0" Feb 17 15:12:47 crc kubenswrapper[4717]: E0217 15:12:47.897143 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0\": container with ID starting with 2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0 not found: ID does not exist" containerID="2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0" Feb 17 15:12:47 crc 
kubenswrapper[4717]: I0217 15:12:47.897188 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0"} err="failed to get container status \"2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0\": rpc error: code = NotFound desc = could not find container \"2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0\": container with ID starting with 2cc06c094d64fd1b11efbaebdd7cbf4742e145bd8c7a138b4ede8434d34907d0 not found: ID does not exist" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.897219 4717 scope.go:117] "RemoveContainer" containerID="09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1" Feb 17 15:12:47 crc kubenswrapper[4717]: E0217 15:12:47.897654 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1\": container with ID starting with 09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1 not found: ID does not exist" containerID="09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.897693 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1"} err="failed to get container status \"09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1\": rpc error: code = NotFound desc = could not find container \"09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1\": container with ID starting with 09da9d92c0afb5f67582ac704798b9e4ec305c8af2b73af49b9eb3ba914ed8a1 not found: ID does not exist" Feb 17 15:12:47 crc kubenswrapper[4717]: I0217 15:12:47.905068 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=3.810394145 podStartE2EDuration="17.905052842s" podCreationTimestamp="2026-02-17 15:12:30 +0000 UTC" firstStartedPulling="2026-02-17 15:12:32.073944901 +0000 UTC m=+1218.489785377" lastFinishedPulling="2026-02-17 15:12:46.168603598 +0000 UTC m=+1232.584444074" observedRunningTime="2026-02-17 15:12:47.896942282 +0000 UTC m=+1234.312782768" watchObservedRunningTime="2026-02-17 15:12:47.905052842 +0000 UTC m=+1234.320893318" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.406009 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-874f74f55-n92h5"] Feb 17 15:12:48 crc kubenswrapper[4717]: E0217 15:12:48.406593 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="932dfe25-b5f6-4cc6-92f7-6594c8449263" containerName="placement-log" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.406609 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="932dfe25-b5f6-4cc6-92f7-6594c8449263" containerName="placement-log" Feb 17 15:12:48 crc kubenswrapper[4717]: E0217 15:12:48.406657 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="932dfe25-b5f6-4cc6-92f7-6594c8449263" containerName="placement-api" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.406665 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="932dfe25-b5f6-4cc6-92f7-6594c8449263" containerName="placement-api" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.406814 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="932dfe25-b5f6-4cc6-92f7-6594c8449263" containerName="placement-log" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.406866 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="932dfe25-b5f6-4cc6-92f7-6594c8449263" containerName="placement-api" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.407768 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.411247 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.411454 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.411578 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.436669 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-874f74f55-n92h5"] Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.501696 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8206d35f-44b8-45f5-9286-8e4179701b96-etc-swift\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.501760 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-combined-ca-bundle\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.501831 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8206d35f-44b8-45f5-9286-8e4179701b96-log-httpd\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.501906 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-public-tls-certs\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.501929 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8206d35f-44b8-45f5-9286-8e4179701b96-run-httpd\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.501956 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-internal-tls-certs\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.501982 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4kbq\" (UniqueName: \"kubernetes.io/projected/8206d35f-44b8-45f5-9286-8e4179701b96-kube-api-access-k4kbq\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.502018 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-config-data\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc 
kubenswrapper[4717]: I0217 15:12:48.603446 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8206d35f-44b8-45f5-9286-8e4179701b96-etc-swift\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.603502 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-combined-ca-bundle\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.603565 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8206d35f-44b8-45f5-9286-8e4179701b96-log-httpd\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.603632 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-public-tls-certs\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.603653 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8206d35f-44b8-45f5-9286-8e4179701b96-run-httpd\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.603674 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-internal-tls-certs\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.603696 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4kbq\" (UniqueName: \"kubernetes.io/projected/8206d35f-44b8-45f5-9286-8e4179701b96-kube-api-access-k4kbq\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.603723 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-config-data\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.605354 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8206d35f-44b8-45f5-9286-8e4179701b96-log-httpd\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.605414 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8206d35f-44b8-45f5-9286-8e4179701b96-run-httpd\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.615716 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-combined-ca-bundle\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.618795 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-config-data\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.619720 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-internal-tls-certs\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.625907 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8206d35f-44b8-45f5-9286-8e4179701b96-public-tls-certs\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.626681 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4kbq\" (UniqueName: \"kubernetes.io/projected/8206d35f-44b8-45f5-9286-8e4179701b96-kube-api-access-k4kbq\") pod \"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.627515 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8206d35f-44b8-45f5-9286-8e4179701b96-etc-swift\") pod 
\"swift-proxy-874f74f55-n92h5\" (UID: \"8206d35f-44b8-45f5-9286-8e4179701b96\") " pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.685508 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xzx8s"] Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.687575 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xzx8s" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.698793 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xzx8s"] Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.746138 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.807061 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8shn\" (UniqueName: \"kubernetes.io/projected/062a7610-89de-4af4-a35d-b965eff08320-kube-api-access-w8shn\") pod \"nova-api-db-create-xzx8s\" (UID: \"062a7610-89de-4af4-a35d-b965eff08320\") " pod="openstack/nova-api-db-create-xzx8s" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.807165 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/062a7610-89de-4af4-a35d-b965eff08320-operator-scripts\") pod \"nova-api-db-create-xzx8s\" (UID: \"062a7610-89de-4af4-a35d-b965eff08320\") " pod="openstack/nova-api-db-create-xzx8s" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.851504 4717 generic.go:334] "Generic (PLEG): container finished" podID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerID="a0b65f0f7b866f77822127cb3df60c9c782c107ca199b4c83b3b3476d73b31c0" exitCode=0 Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.851551 4717 generic.go:334] "Generic (PLEG): container 
finished" podID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerID="2ac53413e7c6847ab1251601cb05d12b1ad759d3861c77073c818853c34f9522" exitCode=2 Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.851564 4717 generic.go:334] "Generic (PLEG): container finished" podID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerID="aa3b8b5ff9855549d1302342083239260b7758d2e94d94537bfac3ec3d903725" exitCode=0 Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.851574 4717 generic.go:334] "Generic (PLEG): container finished" podID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerID="0ffa36f3aa17567b5726cc123f55bc87e4712b60fc91f1e4e9e672f4381f5a4c" exitCode=0 Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.852575 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4","Type":"ContainerDied","Data":"a0b65f0f7b866f77822127cb3df60c9c782c107ca199b4c83b3b3476d73b31c0"} Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.852608 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4","Type":"ContainerDied","Data":"2ac53413e7c6847ab1251601cb05d12b1ad759d3861c77073c818853c34f9522"} Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.852624 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4","Type":"ContainerDied","Data":"aa3b8b5ff9855549d1302342083239260b7758d2e94d94537bfac3ec3d903725"} Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.852636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4","Type":"ContainerDied","Data":"0ffa36f3aa17567b5726cc123f55bc87e4712b60fc91f1e4e9e672f4381f5a4c"} Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.894934 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9xpzr"] 
Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.897380 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9xpzr" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.909398 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/062a7610-89de-4af4-a35d-b965eff08320-operator-scripts\") pod \"nova-api-db-create-xzx8s\" (UID: \"062a7610-89de-4af4-a35d-b965eff08320\") " pod="openstack/nova-api-db-create-xzx8s" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.910666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/062a7610-89de-4af4-a35d-b965eff08320-operator-scripts\") pod \"nova-api-db-create-xzx8s\" (UID: \"062a7610-89de-4af4-a35d-b965eff08320\") " pod="openstack/nova-api-db-create-xzx8s" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.911712 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8shn\" (UniqueName: \"kubernetes.io/projected/062a7610-89de-4af4-a35d-b965eff08320-kube-api-access-w8shn\") pod \"nova-api-db-create-xzx8s\" (UID: \"062a7610-89de-4af4-a35d-b965eff08320\") " pod="openstack/nova-api-db-create-xzx8s" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.934691 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8shn\" (UniqueName: \"kubernetes.io/projected/062a7610-89de-4af4-a35d-b965eff08320-kube-api-access-w8shn\") pod \"nova-api-db-create-xzx8s\" (UID: \"062a7610-89de-4af4-a35d-b965eff08320\") " pod="openstack/nova-api-db-create-xzx8s" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.941484 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9xpzr"] Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.975400 4717 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-c075-account-create-update-d8f2k"] Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.984866 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c075-account-create-update-d8f2k" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.988970 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 17 15:12:48 crc kubenswrapper[4717]: I0217 15:12:48.994831 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c075-account-create-update-d8f2k"] Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.012747 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02c1955-1938-4cac-b38e-e1eec7332813-operator-scripts\") pod \"nova-cell0-db-create-9xpzr\" (UID: \"d02c1955-1938-4cac-b38e-e1eec7332813\") " pod="openstack/nova-cell0-db-create-9xpzr" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.012831 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq8s6\" (UniqueName: \"kubernetes.io/projected/d02c1955-1938-4cac-b38e-e1eec7332813-kube-api-access-pq8s6\") pod \"nova-cell0-db-create-9xpzr\" (UID: \"d02c1955-1938-4cac-b38e-e1eec7332813\") " pod="openstack/nova-cell0-db-create-9xpzr" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.014842 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-6nlkw"] Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.016353 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-6nlkw" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.039674 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6nlkw"] Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.053521 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xzx8s" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.110985 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-bc18-account-create-update-qs9qz"] Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.112037 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.116378 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.117191 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-operator-scripts\") pod \"nova-api-c075-account-create-update-d8f2k\" (UID: \"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700\") " pod="openstack/nova-api-c075-account-create-update-d8f2k" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.117368 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02c1955-1938-4cac-b38e-e1eec7332813-operator-scripts\") pod \"nova-cell0-db-create-9xpzr\" (UID: \"d02c1955-1938-4cac-b38e-e1eec7332813\") " pod="openstack/nova-cell0-db-create-9xpzr" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.118143 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq8s6\" (UniqueName: 
\"kubernetes.io/projected/d02c1955-1938-4cac-b38e-e1eec7332813-kube-api-access-pq8s6\") pod \"nova-cell0-db-create-9xpzr\" (UID: \"d02c1955-1938-4cac-b38e-e1eec7332813\") " pod="openstack/nova-cell0-db-create-9xpzr" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.118206 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9s8q\" (UniqueName: \"kubernetes.io/projected/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-kube-api-access-q9s8q\") pod \"nova-api-c075-account-create-update-d8f2k\" (UID: \"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700\") " pod="openstack/nova-api-c075-account-create-update-d8f2k" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.118163 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02c1955-1938-4cac-b38e-e1eec7332813-operator-scripts\") pod \"nova-cell0-db-create-9xpzr\" (UID: \"d02c1955-1938-4cac-b38e-e1eec7332813\") " pod="openstack/nova-cell0-db-create-9xpzr" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.125531 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bc18-account-create-update-qs9qz"] Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.136224 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq8s6\" (UniqueName: \"kubernetes.io/projected/d02c1955-1938-4cac-b38e-e1eec7332813-kube-api-access-pq8s6\") pod \"nova-cell0-db-create-9xpzr\" (UID: \"d02c1955-1938-4cac-b38e-e1eec7332813\") " pod="openstack/nova-cell0-db-create-9xpzr" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.219665 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-operator-scripts\") pod \"nova-cell1-db-create-6nlkw\" (UID: \"47393b1e-7c8f-491e-a25a-0ef15e7eef3d\") " 
pod="openstack/nova-cell1-db-create-6nlkw" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.219718 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzrc9\" (UniqueName: \"kubernetes.io/projected/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-kube-api-access-bzrc9\") pod \"nova-cell1-db-create-6nlkw\" (UID: \"47393b1e-7c8f-491e-a25a-0ef15e7eef3d\") " pod="openstack/nova-cell1-db-create-6nlkw" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.219745 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-operator-scripts\") pod \"nova-api-c075-account-create-update-d8f2k\" (UID: \"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700\") " pod="openstack/nova-api-c075-account-create-update-d8f2k" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.219839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrb2\" (UniqueName: \"kubernetes.io/projected/7970d67a-9fb0-493c-a143-bc2fee1d4c08-kube-api-access-jtrb2\") pod \"nova-cell0-bc18-account-create-update-qs9qz\" (UID: \"7970d67a-9fb0-493c-a143-bc2fee1d4c08\") " pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.219881 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9s8q\" (UniqueName: \"kubernetes.io/projected/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-kube-api-access-q9s8q\") pod \"nova-api-c075-account-create-update-d8f2k\" (UID: \"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700\") " pod="openstack/nova-api-c075-account-create-update-d8f2k" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.219899 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7970d67a-9fb0-493c-a143-bc2fee1d4c08-operator-scripts\") pod \"nova-cell0-bc18-account-create-update-qs9qz\" (UID: \"7970d67a-9fb0-493c-a143-bc2fee1d4c08\") " pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.221160 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-operator-scripts\") pod \"nova-api-c075-account-create-update-d8f2k\" (UID: \"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700\") " pod="openstack/nova-api-c075-account-create-update-d8f2k" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.241573 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9xpzr" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.241752 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9s8q\" (UniqueName: \"kubernetes.io/projected/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-kube-api-access-q9s8q\") pod \"nova-api-c075-account-create-update-d8f2k\" (UID: \"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700\") " pod="openstack/nova-api-c075-account-create-update-d8f2k" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.307685 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7260-account-create-update-wtt26"] Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.308819 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7260-account-create-update-wtt26" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.311065 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.312266 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c075-account-create-update-d8f2k" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.324693 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrb2\" (UniqueName: \"kubernetes.io/projected/7970d67a-9fb0-493c-a143-bc2fee1d4c08-kube-api-access-jtrb2\") pod \"nova-cell0-bc18-account-create-update-qs9qz\" (UID: \"7970d67a-9fb0-493c-a143-bc2fee1d4c08\") " pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.324760 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f67dea-44d1-487d-9981-d758db840579-operator-scripts\") pod \"nova-cell1-7260-account-create-update-wtt26\" (UID: \"70f67dea-44d1-487d-9981-d758db840579\") " pod="openstack/nova-cell1-7260-account-create-update-wtt26" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.324805 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8wpx\" (UniqueName: \"kubernetes.io/projected/70f67dea-44d1-487d-9981-d758db840579-kube-api-access-b8wpx\") pod \"nova-cell1-7260-account-create-update-wtt26\" (UID: \"70f67dea-44d1-487d-9981-d758db840579\") " pod="openstack/nova-cell1-7260-account-create-update-wtt26" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.324839 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7970d67a-9fb0-493c-a143-bc2fee1d4c08-operator-scripts\") pod \"nova-cell0-bc18-account-create-update-qs9qz\" (UID: \"7970d67a-9fb0-493c-a143-bc2fee1d4c08\") " pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.336576 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-operator-scripts\") pod \"nova-cell1-db-create-6nlkw\" (UID: \"47393b1e-7c8f-491e-a25a-0ef15e7eef3d\") " pod="openstack/nova-cell1-db-create-6nlkw" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.337201 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzrc9\" (UniqueName: \"kubernetes.io/projected/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-kube-api-access-bzrc9\") pod \"nova-cell1-db-create-6nlkw\" (UID: \"47393b1e-7c8f-491e-a25a-0ef15e7eef3d\") " pod="openstack/nova-cell1-db-create-6nlkw" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.342037 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7970d67a-9fb0-493c-a143-bc2fee1d4c08-operator-scripts\") pod \"nova-cell0-bc18-account-create-update-qs9qz\" (UID: \"7970d67a-9fb0-493c-a143-bc2fee1d4c08\") " pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.342104 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-operator-scripts\") pod \"nova-cell1-db-create-6nlkw\" (UID: \"47393b1e-7c8f-491e-a25a-0ef15e7eef3d\") " pod="openstack/nova-cell1-db-create-6nlkw" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.345051 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7260-account-create-update-wtt26"] Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.371476 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzrc9\" (UniqueName: \"kubernetes.io/projected/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-kube-api-access-bzrc9\") pod \"nova-cell1-db-create-6nlkw\" (UID: \"47393b1e-7c8f-491e-a25a-0ef15e7eef3d\") " pod="openstack/nova-cell1-db-create-6nlkw" Feb 17 
15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.379673 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrb2\" (UniqueName: \"kubernetes.io/projected/7970d67a-9fb0-493c-a143-bc2fee1d4c08-kube-api-access-jtrb2\") pod \"nova-cell0-bc18-account-create-update-qs9qz\" (UID: \"7970d67a-9fb0-493c-a143-bc2fee1d4c08\") " pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.407612 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.450102 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.452110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f67dea-44d1-487d-9981-d758db840579-operator-scripts\") pod \"nova-cell1-7260-account-create-update-wtt26\" (UID: \"70f67dea-44d1-487d-9981-d758db840579\") " pod="openstack/nova-cell1-7260-account-create-update-wtt26" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.452149 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8wpx\" (UniqueName: \"kubernetes.io/projected/70f67dea-44d1-487d-9981-d758db840579-kube-api-access-b8wpx\") pod \"nova-cell1-7260-account-create-update-wtt26\" (UID: \"70f67dea-44d1-487d-9981-d758db840579\") " pod="openstack/nova-cell1-7260-account-create-update-wtt26" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.453252 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f67dea-44d1-487d-9981-d758db840579-operator-scripts\") pod \"nova-cell1-7260-account-create-update-wtt26\" (UID: \"70f67dea-44d1-487d-9981-d758db840579\") 
" pod="openstack/nova-cell1-7260-account-create-update-wtt26" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.472303 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8wpx\" (UniqueName: \"kubernetes.io/projected/70f67dea-44d1-487d-9981-d758db840579-kube-api-access-b8wpx\") pod \"nova-cell1-7260-account-create-update-wtt26\" (UID: \"70f67dea-44d1-487d-9981-d758db840579\") " pod="openstack/nova-cell1-7260-account-create-update-wtt26" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.488771 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7260-account-create-update-wtt26" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.555302 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-874f74f55-n92h5"] Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.626368 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.655362 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-run-httpd\") pod \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.655802 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-config-data\") pod \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.655869 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ghtq\" (UniqueName: \"kubernetes.io/projected/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-kube-api-access-8ghtq\") pod 
\"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.655933 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-scripts\") pod \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.655970 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-log-httpd\") pod \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.656001 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-sg-core-conf-yaml\") pod \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.656048 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-combined-ca-bundle\") pod \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\" (UID: \"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4\") " Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.658135 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" (UID: "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.662930 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" (UID: "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.666472 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-scripts" (OuterVolumeSpecName: "scripts") pod "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" (UID: "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.666503 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-kube-api-access-8ghtq" (OuterVolumeSpecName: "kube-api-access-8ghtq") pod "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" (UID: "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4"). InnerVolumeSpecName "kube-api-access-8ghtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.671258 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6nlkw" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.723524 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" (UID: "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.743581 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" (UID: "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.761113 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.761141 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.761151 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ghtq\" (UniqueName: \"kubernetes.io/projected/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-kube-api-access-8ghtq\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.761164 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.761174 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.761183 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.776584 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xzx8s"] Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.812213 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-config-data" (OuterVolumeSpecName: "config-data") pod "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" (UID: "74c44c4e-3fc5-4e49-ae79-e5123df3bcb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.858514 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="932dfe25-b5f6-4cc6-92f7-6594c8449263" path="/var/lib/kubelet/pods/932dfe25-b5f6-4cc6-92f7-6594c8449263/volumes" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.863608 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.885116 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74c44c4e-3fc5-4e49-ae79-e5123df3bcb4","Type":"ContainerDied","Data":"1a9a67b818009381d2a4826767920304abb28ff0ccd71d34f3c3330eaaeefa0c"} Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.885132 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.885174 4717 scope.go:117] "RemoveContainer" containerID="a0b65f0f7b866f77822127cb3df60c9c782c107ca199b4c83b3b3476d73b31c0" Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.909814 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xzx8s" event={"ID":"062a7610-89de-4af4-a35d-b965eff08320","Type":"ContainerStarted","Data":"5bcbb8410e0552b50aab7c79c09503b482c52d485bfa1859c56e8727cbafbbf2"} Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.914441 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-874f74f55-n92h5" event={"ID":"8206d35f-44b8-45f5-9286-8e4179701b96","Type":"ContainerStarted","Data":"75ecd38d0ac21fe44f1d3b8e10300ebcf3b296182dafca71abebabb406b32521"} Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.914481 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-874f74f55-n92h5" event={"ID":"8206d35f-44b8-45f5-9286-8e4179701b96","Type":"ContainerStarted","Data":"ddea8c5e74402bf09907a379b1a2f7c4b072bdb77aba39a0cc7dc6a9c2ff2553"} Feb 17 15:12:49 crc kubenswrapper[4717]: I0217 15:12:49.946237 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9xpzr"] Feb 17 15:12:49 crc kubenswrapper[4717]: W0217 15:12:49.953277 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd02c1955_1938_4cac_b38e_e1eec7332813.slice/crio-b22faca357f7ddc6488df5704a0982c1dfe9ec90167036134ad33d56ee6b115d WatchSource:0}: Error finding container b22faca357f7ddc6488df5704a0982c1dfe9ec90167036134ad33d56ee6b115d: Status 404 returned error can't find the container with id b22faca357f7ddc6488df5704a0982c1dfe9ec90167036134ad33d56ee6b115d Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.010839 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.026676 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.041750 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:12:50 crc kubenswrapper[4717]: E0217 15:12:50.042172 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="ceilometer-central-agent" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.042185 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="ceilometer-central-agent" Feb 17 15:12:50 crc kubenswrapper[4717]: E0217 15:12:50.042199 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="sg-core" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.042205 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="sg-core" Feb 17 15:12:50 crc kubenswrapper[4717]: E0217 15:12:50.042217 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="ceilometer-notification-agent" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.042223 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="ceilometer-notification-agent" Feb 17 15:12:50 crc kubenswrapper[4717]: E0217 15:12:50.042233 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="proxy-httpd" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.042238 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="proxy-httpd" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.042427 
4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="proxy-httpd" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.042437 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="ceilometer-central-agent" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.042447 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="sg-core" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.042463 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" containerName="ceilometer-notification-agent" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.044192 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.045157 4717 scope.go:117] "RemoveContainer" containerID="2ac53413e7c6847ab1251601cb05d12b1ad759d3861c77073c818853c34f9522" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.050690 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.050714 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.062745 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.083952 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-scripts\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 
15:12:50.084029 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-log-httpd\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.084070 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.084118 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-config-data\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.084313 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.084365 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28xvp\" (UniqueName: \"kubernetes.io/projected/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-kube-api-access-28xvp\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.084437 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-run-httpd\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.088190 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7260-account-create-update-wtt26"] Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.143229 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c075-account-create-update-d8f2k"] Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.160255 4717 scope.go:117] "RemoveContainer" containerID="aa3b8b5ff9855549d1302342083239260b7758d2e94d94537bfac3ec3d903725" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.165966 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bc18-account-create-update-qs9qz"] Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.188107 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-scripts\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.188183 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-log-httpd\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0" Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.188223 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0" Feb 17 15:12:50 crc 
kubenswrapper[4717]: I0217 15:12:50.188253 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-config-data\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.188304 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.188324 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28xvp\" (UniqueName: \"kubernetes.io/projected/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-kube-api-access-28xvp\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.188349 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-run-httpd\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.188666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-log-httpd\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.188717 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-run-httpd\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.196207 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-scripts\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.196771 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.196835 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.197244 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-config-data\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.207787 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28xvp\" (UniqueName: \"kubernetes.io/projected/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-kube-api-access-28xvp\") pod \"ceilometer-0\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.241239 4717 scope.go:117] "RemoveContainer" containerID="0ffa36f3aa17567b5726cc123f55bc87e4712b60fc91f1e4e9e672f4381f5a4c"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.307694 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6nlkw"]
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.390205 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.816471 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.816717 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.939361 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6nlkw" event={"ID":"47393b1e-7c8f-491e-a25a-0ef15e7eef3d","Type":"ContainerStarted","Data":"3434890440c8c96db0a0ff40d92887735467f5ec9d22d7e6d1dcdda7223f0922"}
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.939414 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6nlkw" event={"ID":"47393b1e-7c8f-491e-a25a-0ef15e7eef3d","Type":"ContainerStarted","Data":"751039c395191122a8c62ae14eb71735e2571594dbf5e4272967a31ce6b8560d"}
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.944069 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c075-account-create-update-d8f2k" event={"ID":"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700","Type":"ContainerStarted","Data":"f33c50edd36ecabb200538b62fe7adc36e1722f79d2e1b875b1b36359e991b96"}
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.944126 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c075-account-create-update-d8f2k" event={"ID":"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700","Type":"ContainerStarted","Data":"832ed5baf96a71ac98793568054123bab91c283fbc47b57384d11f4ff7e0b04b"}
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.951997 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9xpzr" event={"ID":"d02c1955-1938-4cac-b38e-e1eec7332813","Type":"ContainerStarted","Data":"8f78b1d79b5cc9a34466bc0d944fec17c7c33792639de76621b324984fd047b1"}
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.952043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9xpzr" event={"ID":"d02c1955-1938-4cac-b38e-e1eec7332813","Type":"ContainerStarted","Data":"b22faca357f7ddc6488df5704a0982c1dfe9ec90167036134ad33d56ee6b115d"}
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.956649 4717 generic.go:334] "Generic (PLEG): container finished" podID="062a7610-89de-4af4-a35d-b965eff08320" containerID="0b48c1cea0ca4bad7a37163fe233403ebe72d23f69a6d0a7a5a08762638b0dd0" exitCode=0
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.957190 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xzx8s" event={"ID":"062a7610-89de-4af4-a35d-b965eff08320","Type":"ContainerDied","Data":"0b48c1cea0ca4bad7a37163fe233403ebe72d23f69a6d0a7a5a08762638b0dd0"}
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.961002 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-6nlkw" podStartSLOduration=2.9609885179999997 podStartE2EDuration="2.960988518s" podCreationTimestamp="2026-02-17 15:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:50.958362003 +0000 UTC m=+1237.374202479" watchObservedRunningTime="2026-02-17 15:12:50.960988518 +0000 UTC m=+1237.376828994"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.963535 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7260-account-create-update-wtt26" event={"ID":"70f67dea-44d1-487d-9981-d758db840579","Type":"ContainerStarted","Data":"23b0af1a8bdcb765fc916a7eeaabd1df7b569556c9185090f4dbdfde1ed35f1d"}
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.963598 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7260-account-create-update-wtt26" event={"ID":"70f67dea-44d1-487d-9981-d758db840579","Type":"ContainerStarted","Data":"6d3292131134cc8ead2517c9ca86792e45eb581ef518fbd9990bc3b6d2514aa9"}
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.969365 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-874f74f55-n92h5" event={"ID":"8206d35f-44b8-45f5-9286-8e4179701b96","Type":"ContainerStarted","Data":"c49f993245c56604c1af94202526d4d6d9f892ae079341ebb6ed179b48e34f3e"}
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.969617 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-874f74f55-n92h5"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.969749 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-874f74f55-n92h5"
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.987230 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" event={"ID":"7970d67a-9fb0-493c-a143-bc2fee1d4c08","Type":"ContainerStarted","Data":"b7ce7642ceea57a15c95230a894cb82d9521e86a44a2c27a5c55ef79a0b1a0dc"}
Feb 17 15:12:50 crc kubenswrapper[4717]: I0217 15:12:50.987294 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" event={"ID":"7970d67a-9fb0-493c-a143-bc2fee1d4c08","Type":"ContainerStarted","Data":"26c88c25b3954d61527a90d9ae793bbf1f120946d39dfeca204a04dae9917d2c"}
Feb 17 15:12:51 crc kubenswrapper[4717]: I0217 15:12:51.001439 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-c075-account-create-update-d8f2k" podStartSLOduration=3.001341243 podStartE2EDuration="3.001341243s" podCreationTimestamp="2026-02-17 15:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:50.993198872 +0000 UTC m=+1237.409039348" watchObservedRunningTime="2026-02-17 15:12:51.001341243 +0000 UTC m=+1237.417181729"
Feb 17 15:12:51 crc kubenswrapper[4717]: I0217 15:12:51.017871 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-9xpzr" podStartSLOduration=3.017851131 podStartE2EDuration="3.017851131s" podCreationTimestamp="2026-02-17 15:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:51.006350325 +0000 UTC m=+1237.422190801" watchObservedRunningTime="2026-02-17 15:12:51.017851131 +0000 UTC m=+1237.433691617"
Feb 17 15:12:51 crc kubenswrapper[4717]: I0217 15:12:51.097884 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-7260-account-create-update-wtt26" podStartSLOduration=2.097858751 podStartE2EDuration="2.097858751s" podCreationTimestamp="2026-02-17 15:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:51.017698807 +0000 UTC m=+1237.433539283" watchObservedRunningTime="2026-02-17 15:12:51.097858751 +0000 UTC m=+1237.513699227"
Feb 17 15:12:51 crc kubenswrapper[4717]: I0217 15:12:51.099301 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" podStartSLOduration=2.099295022 podStartE2EDuration="2.099295022s" podCreationTimestamp="2026-02-17 15:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:51.066334937 +0000 UTC m=+1237.482175413" watchObservedRunningTime="2026-02-17 15:12:51.099295022 +0000 UTC m=+1237.515135498"
Feb 17 15:12:51 crc kubenswrapper[4717]: I0217 15:12:51.116124 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-874f74f55-n92h5" podStartSLOduration=3.116100379 podStartE2EDuration="3.116100379s" podCreationTimestamp="2026-02-17 15:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:12:51.090521683 +0000 UTC m=+1237.506362169" watchObservedRunningTime="2026-02-17 15:12:51.116100379 +0000 UTC m=+1237.531940865"
Feb 17 15:12:51 crc kubenswrapper[4717]: I0217 15:12:51.859978 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74c44c4e-3fc5-4e49-ae79-e5123df3bcb4" path="/var/lib/kubelet/pods/74c44c4e-3fc5-4e49-ae79-e5123df3bcb4/volumes"
Feb 17 15:12:52 crc kubenswrapper[4717]: I0217 15:12:51.997130 4717 generic.go:334] "Generic (PLEG): container finished" podID="7970d67a-9fb0-493c-a143-bc2fee1d4c08" containerID="b7ce7642ceea57a15c95230a894cb82d9521e86a44a2c27a5c55ef79a0b1a0dc" exitCode=0
Feb 17 15:12:52 crc kubenswrapper[4717]: I0217 15:12:51.997187 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" event={"ID":"7970d67a-9fb0-493c-a143-bc2fee1d4c08","Type":"ContainerDied","Data":"b7ce7642ceea57a15c95230a894cb82d9521e86a44a2c27a5c55ef79a0b1a0dc"}
Feb 17 15:12:52 crc kubenswrapper[4717]: I0217 15:12:51.999684 4717 generic.go:334] "Generic (PLEG): container finished" podID="47393b1e-7c8f-491e-a25a-0ef15e7eef3d" containerID="3434890440c8c96db0a0ff40d92887735467f5ec9d22d7e6d1dcdda7223f0922" exitCode=0
Feb 17 15:12:52 crc kubenswrapper[4717]: I0217 15:12:51.999813 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6nlkw" event={"ID":"47393b1e-7c8f-491e-a25a-0ef15e7eef3d","Type":"ContainerDied","Data":"3434890440c8c96db0a0ff40d92887735467f5ec9d22d7e6d1dcdda7223f0922"}
Feb 17 15:12:52 crc kubenswrapper[4717]: I0217 15:12:52.006155 4717 generic.go:334] "Generic (PLEG): container finished" podID="4dd76b94-c071-4cae-8a3b-f1ef5a3ed700" containerID="f33c50edd36ecabb200538b62fe7adc36e1722f79d2e1b875b1b36359e991b96" exitCode=0
Feb 17 15:12:52 crc kubenswrapper[4717]: I0217 15:12:52.006202 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c075-account-create-update-d8f2k" event={"ID":"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700","Type":"ContainerDied","Data":"f33c50edd36ecabb200538b62fe7adc36e1722f79d2e1b875b1b36359e991b96"}
Feb 17 15:12:52 crc kubenswrapper[4717]: I0217 15:12:52.008849 4717 generic.go:334] "Generic (PLEG): container finished" podID="d02c1955-1938-4cac-b38e-e1eec7332813" containerID="8f78b1d79b5cc9a34466bc0d944fec17c7c33792639de76621b324984fd047b1" exitCode=0
Feb 17 15:12:52 crc kubenswrapper[4717]: I0217 15:12:52.008906 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9xpzr" event={"ID":"d02c1955-1938-4cac-b38e-e1eec7332813","Type":"ContainerDied","Data":"8f78b1d79b5cc9a34466bc0d944fec17c7c33792639de76621b324984fd047b1"}
Feb 17 15:12:52 crc kubenswrapper[4717]: I0217 15:12:52.012777 4717 generic.go:334] "Generic (PLEG): container finished" podID="70f67dea-44d1-487d-9981-d758db840579" containerID="23b0af1a8bdcb765fc916a7eeaabd1df7b569556c9185090f4dbdfde1ed35f1d" exitCode=0
Feb 17 15:12:52 crc kubenswrapper[4717]: I0217 15:12:52.012901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7260-account-create-update-wtt26" event={"ID":"70f67dea-44d1-487d-9981-d758db840579","Type":"ContainerDied","Data":"23b0af1a8bdcb765fc916a7eeaabd1df7b569556c9185090f4dbdfde1ed35f1d"}
Feb 17 15:12:54 crc kubenswrapper[4717]: I0217 15:12:54.134555 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b696c957f-jwh8l"
Feb 17 15:12:54 crc kubenswrapper[4717]: I0217 15:12:54.242036 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6d69c5bc8d-pt8k6"]
Feb 17 15:12:54 crc kubenswrapper[4717]: I0217 15:12:54.248274 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6d69c5bc8d-pt8k6" podUID="a36880ff-5e50-413f-8213-79ed16bed713" containerName="neutron-api" containerID="cri-o://5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277" gracePeriod=30
Feb 17 15:12:54 crc kubenswrapper[4717]: I0217 15:12:54.248711 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6d69c5bc8d-pt8k6" podUID="a36880ff-5e50-413f-8213-79ed16bed713" containerName="neutron-httpd" containerID="cri-o://950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e" gracePeriod=30
Feb 17 15:12:54 crc kubenswrapper[4717]: I0217 15:12:54.656869 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 17 15:12:55 crc kubenswrapper[4717]: I0217 15:12:55.016430 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b88fd5cc6-dqjmc" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused"
Feb 17 15:12:55 crc kubenswrapper[4717]: I0217 15:12:55.052123 4717 generic.go:334] "Generic (PLEG): container finished" podID="a36880ff-5e50-413f-8213-79ed16bed713" containerID="950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e" exitCode=0
Feb 17 15:12:55 crc kubenswrapper[4717]: I0217 15:12:55.052215 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d69c5bc8d-pt8k6" event={"ID":"a36880ff-5e50-413f-8213-79ed16bed713","Type":"ContainerDied","Data":"950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e"}
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.825313 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9xpzr"
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.835827 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bc18-account-create-update-qs9qz"
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.868501 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xzx8s"
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.890348 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c075-account-create-update-d8f2k"
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.897061 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7970d67a-9fb0-493c-a143-bc2fee1d4c08-operator-scripts\") pod \"7970d67a-9fb0-493c-a143-bc2fee1d4c08\" (UID: \"7970d67a-9fb0-493c-a143-bc2fee1d4c08\") "
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.897390 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq8s6\" (UniqueName: \"kubernetes.io/projected/d02c1955-1938-4cac-b38e-e1eec7332813-kube-api-access-pq8s6\") pod \"d02c1955-1938-4cac-b38e-e1eec7332813\" (UID: \"d02c1955-1938-4cac-b38e-e1eec7332813\") "
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.897430 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtrb2\" (UniqueName: \"kubernetes.io/projected/7970d67a-9fb0-493c-a143-bc2fee1d4c08-kube-api-access-jtrb2\") pod \"7970d67a-9fb0-493c-a143-bc2fee1d4c08\" (UID: \"7970d67a-9fb0-493c-a143-bc2fee1d4c08\") "
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.897503 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02c1955-1938-4cac-b38e-e1eec7332813-operator-scripts\") pod \"d02c1955-1938-4cac-b38e-e1eec7332813\" (UID: \"d02c1955-1938-4cac-b38e-e1eec7332813\") "
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.905072 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02c1955-1938-4cac-b38e-e1eec7332813-kube-api-access-pq8s6" (OuterVolumeSpecName: "kube-api-access-pq8s6") pod "d02c1955-1938-4cac-b38e-e1eec7332813" (UID: "d02c1955-1938-4cac-b38e-e1eec7332813"). InnerVolumeSpecName "kube-api-access-pq8s6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.906342 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02c1955-1938-4cac-b38e-e1eec7332813-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d02c1955-1938-4cac-b38e-e1eec7332813" (UID: "d02c1955-1938-4cac-b38e-e1eec7332813"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.906973 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7970d67a-9fb0-493c-a143-bc2fee1d4c08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7970d67a-9fb0-493c-a143-bc2fee1d4c08" (UID: "7970d67a-9fb0-493c-a143-bc2fee1d4c08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.907707 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6nlkw"
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.908774 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq8s6\" (UniqueName: \"kubernetes.io/projected/d02c1955-1938-4cac-b38e-e1eec7332813-kube-api-access-pq8s6\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.908808 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02c1955-1938-4cac-b38e-e1eec7332813-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.908829 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7970d67a-9fb0-493c-a143-bc2fee1d4c08-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.940328 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7970d67a-9fb0-493c-a143-bc2fee1d4c08-kube-api-access-jtrb2" (OuterVolumeSpecName: "kube-api-access-jtrb2") pod "7970d67a-9fb0-493c-a143-bc2fee1d4c08" (UID: "7970d67a-9fb0-493c-a143-bc2fee1d4c08"). InnerVolumeSpecName "kube-api-access-jtrb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:12:57 crc kubenswrapper[4717]: I0217 15:12:57.978099 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7260-account-create-update-wtt26"
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.009498 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9s8q\" (UniqueName: \"kubernetes.io/projected/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-kube-api-access-q9s8q\") pod \"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700\" (UID: \"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.009624 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/062a7610-89de-4af4-a35d-b965eff08320-operator-scripts\") pod \"062a7610-89de-4af4-a35d-b965eff08320\" (UID: \"062a7610-89de-4af4-a35d-b965eff08320\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.009666 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8shn\" (UniqueName: \"kubernetes.io/projected/062a7610-89de-4af4-a35d-b965eff08320-kube-api-access-w8shn\") pod \"062a7610-89de-4af4-a35d-b965eff08320\" (UID: \"062a7610-89de-4af4-a35d-b965eff08320\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.009696 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzrc9\" (UniqueName: \"kubernetes.io/projected/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-kube-api-access-bzrc9\") pod \"47393b1e-7c8f-491e-a25a-0ef15e7eef3d\" (UID: \"47393b1e-7c8f-491e-a25a-0ef15e7eef3d\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.009726 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-operator-scripts\") pod \"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700\" (UID: \"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.009750 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-operator-scripts\") pod \"47393b1e-7c8f-491e-a25a-0ef15e7eef3d\" (UID: \"47393b1e-7c8f-491e-a25a-0ef15e7eef3d\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.010185 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtrb2\" (UniqueName: \"kubernetes.io/projected/7970d67a-9fb0-493c-a143-bc2fee1d4c08-kube-api-access-jtrb2\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.010924 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47393b1e-7c8f-491e-a25a-0ef15e7eef3d" (UID: "47393b1e-7c8f-491e-a25a-0ef15e7eef3d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.014654 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/062a7610-89de-4af4-a35d-b965eff08320-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "062a7610-89de-4af4-a35d-b965eff08320" (UID: "062a7610-89de-4af4-a35d-b965eff08320"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.014746 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4dd76b94-c071-4cae-8a3b-f1ef5a3ed700" (UID: "4dd76b94-c071-4cae-8a3b-f1ef5a3ed700"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.045886 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-kube-api-access-q9s8q" (OuterVolumeSpecName: "kube-api-access-q9s8q") pod "4dd76b94-c071-4cae-8a3b-f1ef5a3ed700" (UID: "4dd76b94-c071-4cae-8a3b-f1ef5a3ed700"). InnerVolumeSpecName "kube-api-access-q9s8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.046595 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-kube-api-access-bzrc9" (OuterVolumeSpecName: "kube-api-access-bzrc9") pod "47393b1e-7c8f-491e-a25a-0ef15e7eef3d" (UID: "47393b1e-7c8f-491e-a25a-0ef15e7eef3d"). InnerVolumeSpecName "kube-api-access-bzrc9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.047936 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/062a7610-89de-4af4-a35d-b965eff08320-kube-api-access-w8shn" (OuterVolumeSpecName: "kube-api-access-w8shn") pod "062a7610-89de-4af4-a35d-b965eff08320" (UID: "062a7610-89de-4af4-a35d-b965eff08320"). InnerVolumeSpecName "kube-api-access-w8shn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.060489 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d69c5bc8d-pt8k6"
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.082224 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9xpzr" event={"ID":"d02c1955-1938-4cac-b38e-e1eec7332813","Type":"ContainerDied","Data":"b22faca357f7ddc6488df5704a0982c1dfe9ec90167036134ad33d56ee6b115d"}
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.082275 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b22faca357f7ddc6488df5704a0982c1dfe9ec90167036134ad33d56ee6b115d"
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.082372 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9xpzr"
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.091661 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bc18-account-create-update-qs9qz" event={"ID":"7970d67a-9fb0-493c-a143-bc2fee1d4c08","Type":"ContainerDied","Data":"26c88c25b3954d61527a90d9ae793bbf1f120946d39dfeca204a04dae9917d2c"}
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.091707 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26c88c25b3954d61527a90d9ae793bbf1f120946d39dfeca204a04dae9917d2c"
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.091803 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bc18-account-create-update-qs9qz"
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.111740 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f67dea-44d1-487d-9981-d758db840579-operator-scripts\") pod \"70f67dea-44d1-487d-9981-d758db840579\" (UID: \"70f67dea-44d1-487d-9981-d758db840579\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.111817 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-config\") pod \"a36880ff-5e50-413f-8213-79ed16bed713\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.111847 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cbtd\" (UniqueName: \"kubernetes.io/projected/a36880ff-5e50-413f-8213-79ed16bed713-kube-api-access-5cbtd\") pod \"a36880ff-5e50-413f-8213-79ed16bed713\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.111943 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-ovndb-tls-certs\") pod \"a36880ff-5e50-413f-8213-79ed16bed713\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.111994 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-httpd-config\") pod \"a36880ff-5e50-413f-8213-79ed16bed713\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.112060 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8wpx\" (UniqueName: \"kubernetes.io/projected/70f67dea-44d1-487d-9981-d758db840579-kube-api-access-b8wpx\") pod \"70f67dea-44d1-487d-9981-d758db840579\" (UID: \"70f67dea-44d1-487d-9981-d758db840579\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.112136 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-combined-ca-bundle\") pod \"a36880ff-5e50-413f-8213-79ed16bed713\" (UID: \"a36880ff-5e50-413f-8213-79ed16bed713\") "
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.112476 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8shn\" (UniqueName: \"kubernetes.io/projected/062a7610-89de-4af4-a35d-b965eff08320-kube-api-access-w8shn\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.112492 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzrc9\" (UniqueName: \"kubernetes.io/projected/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-kube-api-access-bzrc9\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.112502 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.112511 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47393b1e-7c8f-491e-a25a-0ef15e7eef3d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.112519 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9s8q\" (UniqueName: \"kubernetes.io/projected/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700-kube-api-access-q9s8q\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.112530 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/062a7610-89de-4af4-a35d-b965eff08320-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.114843 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f67dea-44d1-487d-9981-d758db840579-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70f67dea-44d1-487d-9981-d758db840579" (UID: "70f67dea-44d1-487d-9981-d758db840579"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.114975 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6nlkw"
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.115012 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6nlkw" event={"ID":"47393b1e-7c8f-491e-a25a-0ef15e7eef3d","Type":"ContainerDied","Data":"751039c395191122a8c62ae14eb71735e2571594dbf5e4272967a31ce6b8560d"}
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.115098 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="751039c395191122a8c62ae14eb71735e2571594dbf5e4272967a31ce6b8560d"
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.117749 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8cf42885-3509-4779-901c-e88f11c5fdfd","Type":"ContainerStarted","Data":"ee0f9a16be5478985852c4731901ae362172217d8a30fe1cc3595db36d4ebc82"}
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.120442 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36880ff-5e50-413f-8213-79ed16bed713-kube-api-access-5cbtd" (OuterVolumeSpecName: "kube-api-access-5cbtd") pod "a36880ff-5e50-413f-8213-79ed16bed713" (UID: "a36880ff-5e50-413f-8213-79ed16bed713"). InnerVolumeSpecName "kube-api-access-5cbtd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.122116 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f67dea-44d1-487d-9981-d758db840579-kube-api-access-b8wpx" (OuterVolumeSpecName: "kube-api-access-b8wpx") pod "70f67dea-44d1-487d-9981-d758db840579" (UID: "70f67dea-44d1-487d-9981-d758db840579"). InnerVolumeSpecName "kube-api-access-b8wpx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.125984 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a36880ff-5e50-413f-8213-79ed16bed713" (UID: "a36880ff-5e50-413f-8213-79ed16bed713"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.126598 4717 generic.go:334] "Generic (PLEG): container finished" podID="a36880ff-5e50-413f-8213-79ed16bed713" containerID="5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277" exitCode=0
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.126668 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d69c5bc8d-pt8k6" event={"ID":"a36880ff-5e50-413f-8213-79ed16bed713","Type":"ContainerDied","Data":"5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277"}
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.126703 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d69c5bc8d-pt8k6" event={"ID":"a36880ff-5e50-413f-8213-79ed16bed713","Type":"ContainerDied","Data":"3c70c371a8e8e6829c87476fdc229459c4c1ce572d323d5ad85cf70bf024746e"}
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.126724 4717 scope.go:117] "RemoveContainer" containerID="950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e"
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.126728 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d69c5bc8d-pt8k6"
Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.130724 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xzx8s" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.130792 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xzx8s" event={"ID":"062a7610-89de-4af4-a35d-b965eff08320","Type":"ContainerDied","Data":"5bcbb8410e0552b50aab7c79c09503b482c52d485bfa1859c56e8727cbafbbf2"} Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.130830 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bcbb8410e0552b50aab7c79c09503b482c52d485bfa1859c56e8727cbafbbf2" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.144308 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.7036240510000003 podStartE2EDuration="18.144291142s" podCreationTimestamp="2026-02-17 15:12:40 +0000 UTC" firstStartedPulling="2026-02-17 15:12:42.290453166 +0000 UTC m=+1228.706293632" lastFinishedPulling="2026-02-17 15:12:57.731120247 +0000 UTC m=+1244.146960723" observedRunningTime="2026-02-17 15:12:58.13540922 +0000 UTC m=+1244.551249696" watchObservedRunningTime="2026-02-17 15:12:58.144291142 +0000 UTC m=+1244.560131618" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.150897 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c075-account-create-update-d8f2k" event={"ID":"4dd76b94-c071-4cae-8a3b-f1ef5a3ed700","Type":"ContainerDied","Data":"832ed5baf96a71ac98793568054123bab91c283fbc47b57384d11f4ff7e0b04b"} Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.150932 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="832ed5baf96a71ac98793568054123bab91c283fbc47b57384d11f4ff7e0b04b" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.150992 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c075-account-create-update-d8f2k" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.162071 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7260-account-create-update-wtt26" event={"ID":"70f67dea-44d1-487d-9981-d758db840579","Type":"ContainerDied","Data":"6d3292131134cc8ead2517c9ca86792e45eb581ef518fbd9990bc3b6d2514aa9"} Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.162167 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d3292131134cc8ead2517c9ca86792e45eb581ef518fbd9990bc3b6d2514aa9" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.162217 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7260-account-create-update-wtt26" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.178070 4717 scope.go:117] "RemoveContainer" containerID="5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.199385 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a36880ff-5e50-413f-8213-79ed16bed713" (UID: "a36880ff-5e50-413f-8213-79ed16bed713"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.208227 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-config" (OuterVolumeSpecName: "config") pod "a36880ff-5e50-413f-8213-79ed16bed713" (UID: "a36880ff-5e50-413f-8213-79ed16bed713"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.211626 4717 scope.go:117] "RemoveContainer" containerID="950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e" Feb 17 15:12:58 crc kubenswrapper[4717]: E0217 15:12:58.212339 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e\": container with ID starting with 950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e not found: ID does not exist" containerID="950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.212375 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e"} err="failed to get container status \"950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e\": rpc error: code = NotFound desc = could not find container \"950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e\": container with ID starting with 950b453f422105e6d4c5bd43c773473128422b77ea8f7f8130ae02f06bbbfc0e not found: ID does not exist" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.212396 4717 scope.go:117] "RemoveContainer" containerID="5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277" Feb 17 15:12:58 crc kubenswrapper[4717]: E0217 15:12:58.212814 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277\": container with ID starting with 5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277 not found: ID does not exist" containerID="5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.212838 
4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277"} err="failed to get container status \"5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277\": rpc error: code = NotFound desc = could not find container \"5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277\": container with ID starting with 5541665b240cdde42870736a49b532ccdce048b2a843979e950cb3de2341b277 not found: ID does not exist" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.213757 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.213785 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f67dea-44d1-487d-9981-d758db840579-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.213795 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.213804 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cbtd\" (UniqueName: \"kubernetes.io/projected/a36880ff-5e50-413f-8213-79ed16bed713-kube-api-access-5cbtd\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.213814 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.213824 4717 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-b8wpx\" (UniqueName: \"kubernetes.io/projected/70f67dea-44d1-487d-9981-d758db840579-kube-api-access-b8wpx\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.227309 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a36880ff-5e50-413f-8213-79ed16bed713" (UID: "a36880ff-5e50-413f-8213-79ed16bed713"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.259457 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.316245 4717 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36880ff-5e50-413f-8213-79ed16bed713-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.473110 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6d69c5bc8d-pt8k6"] Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.482863 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6d69c5bc8d-pt8k6"] Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.755971 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:58 crc kubenswrapper[4717]: I0217 15:12:58.758004 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-874f74f55-n92h5" Feb 17 15:12:59 crc kubenswrapper[4717]: I0217 15:12:59.173928 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81029e7d-2f38-4925-a5d2-26d5dfa86fdd","Type":"ContainerStarted","Data":"1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713"} Feb 
17 15:12:59 crc kubenswrapper[4717]: I0217 15:12:59.173962 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81029e7d-2f38-4925-a5d2-26d5dfa86fdd","Type":"ContainerStarted","Data":"ea056569c0be616a361a60aa302566765219aa36fda6521f9c4a870f9a792edf"} Feb 17 15:12:59 crc kubenswrapper[4717]: I0217 15:12:59.883007 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36880ff-5e50-413f-8213-79ed16bed713" path="/var/lib/kubelet/pods/a36880ff-5e50-413f-8213-79ed16bed713/volumes" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.238656 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.239130 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81029e7d-2f38-4925-a5d2-26d5dfa86fdd","Type":"ContainerStarted","Data":"d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08"} Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.239584 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81029e7d-2f38-4925-a5d2-26d5dfa86fdd","Type":"ContainerStarted","Data":"c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac"} Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.246197 4717 generic.go:334] "Generic (PLEG): container finished" podID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerID="d8519b0f4b147248ba4d6f91a993baf664f8ea8a630bc4159f522032d1ffea67" exitCode=137 Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.246237 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b88fd5cc6-dqjmc" event={"ID":"946c8c31-01d1-45f7-87c2-a022100aeef9","Type":"ContainerDied","Data":"d8519b0f4b147248ba4d6f91a993baf664f8ea8a630bc4159f522032d1ffea67"} Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.246265 4717 scope.go:117] "RemoveContainer" 
containerID="de13a9e87b8b1b1396a6b56df5eb0db2cb23cd17ad0fd1fa2115022517f5d512" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.370474 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-tls-certs\") pod \"946c8c31-01d1-45f7-87c2-a022100aeef9\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.370532 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-config-data\") pod \"946c8c31-01d1-45f7-87c2-a022100aeef9\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.370557 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wf4p\" (UniqueName: \"kubernetes.io/projected/946c8c31-01d1-45f7-87c2-a022100aeef9-kube-api-access-4wf4p\") pod \"946c8c31-01d1-45f7-87c2-a022100aeef9\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.370625 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/946c8c31-01d1-45f7-87c2-a022100aeef9-logs\") pod \"946c8c31-01d1-45f7-87c2-a022100aeef9\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.370710 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-secret-key\") pod \"946c8c31-01d1-45f7-87c2-a022100aeef9\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.370753 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-combined-ca-bundle\") pod \"946c8c31-01d1-45f7-87c2-a022100aeef9\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.370781 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-scripts\") pod \"946c8c31-01d1-45f7-87c2-a022100aeef9\" (UID: \"946c8c31-01d1-45f7-87c2-a022100aeef9\") " Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.371786 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/946c8c31-01d1-45f7-87c2-a022100aeef9-logs" (OuterVolumeSpecName: "logs") pod "946c8c31-01d1-45f7-87c2-a022100aeef9" (UID: "946c8c31-01d1-45f7-87c2-a022100aeef9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.380255 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "946c8c31-01d1-45f7-87c2-a022100aeef9" (UID: "946c8c31-01d1-45f7-87c2-a022100aeef9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.381315 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/946c8c31-01d1-45f7-87c2-a022100aeef9-kube-api-access-4wf4p" (OuterVolumeSpecName: "kube-api-access-4wf4p") pod "946c8c31-01d1-45f7-87c2-a022100aeef9" (UID: "946c8c31-01d1-45f7-87c2-a022100aeef9"). InnerVolumeSpecName "kube-api-access-4wf4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.401241 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-config-data" (OuterVolumeSpecName: "config-data") pod "946c8c31-01d1-45f7-87c2-a022100aeef9" (UID: "946c8c31-01d1-45f7-87c2-a022100aeef9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.401779 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "946c8c31-01d1-45f7-87c2-a022100aeef9" (UID: "946c8c31-01d1-45f7-87c2-a022100aeef9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.418790 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-scripts" (OuterVolumeSpecName: "scripts") pod "946c8c31-01d1-45f7-87c2-a022100aeef9" (UID: "946c8c31-01d1-45f7-87c2-a022100aeef9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.433172 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "946c8c31-01d1-45f7-87c2-a022100aeef9" (UID: "946c8c31-01d1-45f7-87c2-a022100aeef9"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.451036 4717 scope.go:117] "RemoveContainer" containerID="d8519b0f4b147248ba4d6f91a993baf664f8ea8a630bc4159f522032d1ffea67" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.472692 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.472747 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.472760 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wf4p\" (UniqueName: \"kubernetes.io/projected/946c8c31-01d1-45f7-87c2-a022100aeef9-kube-api-access-4wf4p\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.472775 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/946c8c31-01d1-45f7-87c2-a022100aeef9-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.472788 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.472800 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/946c8c31-01d1-45f7-87c2-a022100aeef9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:00 crc kubenswrapper[4717]: I0217 15:13:00.472810 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/946c8c31-01d1-45f7-87c2-a022100aeef9-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:01 crc kubenswrapper[4717]: I0217 15:13:01.260466 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b88fd5cc6-dqjmc" event={"ID":"946c8c31-01d1-45f7-87c2-a022100aeef9","Type":"ContainerDied","Data":"8622e1417c766e9d3f34c004d62d9076982ac31c050af9355bb76223d14efdaf"} Feb 17 15:13:01 crc kubenswrapper[4717]: I0217 15:13:01.260489 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b88fd5cc6-dqjmc" Feb 17 15:13:01 crc kubenswrapper[4717]: I0217 15:13:01.301036 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b88fd5cc6-dqjmc"] Feb 17 15:13:01 crc kubenswrapper[4717]: I0217 15:13:01.310121 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b88fd5cc6-dqjmc"] Feb 17 15:13:01 crc kubenswrapper[4717]: I0217 15:13:01.857229 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" path="/var/lib/kubelet/pods/946c8c31-01d1-45f7-87c2-a022100aeef9/volumes" Feb 17 15:13:01 crc kubenswrapper[4717]: I0217 15:13:01.877911 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.013691 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-combined-ca-bundle\") pod \"d8363289-fd40-402e-a82d-2d7954bdca28\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.013817 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-scripts\") pod \"d8363289-fd40-402e-a82d-2d7954bdca28\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.013844 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8363289-fd40-402e-a82d-2d7954bdca28-logs\") pod \"d8363289-fd40-402e-a82d-2d7954bdca28\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.013865 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data\") pod \"d8363289-fd40-402e-a82d-2d7954bdca28\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.013951 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlsfh\" (UniqueName: \"kubernetes.io/projected/d8363289-fd40-402e-a82d-2d7954bdca28-kube-api-access-dlsfh\") pod \"d8363289-fd40-402e-a82d-2d7954bdca28\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.013979 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d8363289-fd40-402e-a82d-2d7954bdca28-etc-machine-id\") pod \"d8363289-fd40-402e-a82d-2d7954bdca28\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.014014 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data-custom\") pod \"d8363289-fd40-402e-a82d-2d7954bdca28\" (UID: \"d8363289-fd40-402e-a82d-2d7954bdca28\") " Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.015710 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8363289-fd40-402e-a82d-2d7954bdca28-logs" (OuterVolumeSpecName: "logs") pod "d8363289-fd40-402e-a82d-2d7954bdca28" (UID: "d8363289-fd40-402e-a82d-2d7954bdca28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.016198 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8363289-fd40-402e-a82d-2d7954bdca28-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d8363289-fd40-402e-a82d-2d7954bdca28" (UID: "d8363289-fd40-402e-a82d-2d7954bdca28"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.020559 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-scripts" (OuterVolumeSpecName: "scripts") pod "d8363289-fd40-402e-a82d-2d7954bdca28" (UID: "d8363289-fd40-402e-a82d-2d7954bdca28"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.020601 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d8363289-fd40-402e-a82d-2d7954bdca28" (UID: "d8363289-fd40-402e-a82d-2d7954bdca28"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.021071 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8363289-fd40-402e-a82d-2d7954bdca28-kube-api-access-dlsfh" (OuterVolumeSpecName: "kube-api-access-dlsfh") pod "d8363289-fd40-402e-a82d-2d7954bdca28" (UID: "d8363289-fd40-402e-a82d-2d7954bdca28"). InnerVolumeSpecName "kube-api-access-dlsfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.046732 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8363289-fd40-402e-a82d-2d7954bdca28" (UID: "d8363289-fd40-402e-a82d-2d7954bdca28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.067394 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data" (OuterVolumeSpecName: "config-data") pod "d8363289-fd40-402e-a82d-2d7954bdca28" (UID: "d8363289-fd40-402e-a82d-2d7954bdca28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.115741 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.115769 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.115780 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8363289-fd40-402e-a82d-2d7954bdca28-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.115789 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.115797 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlsfh\" (UniqueName: \"kubernetes.io/projected/d8363289-fd40-402e-a82d-2d7954bdca28-kube-api-access-dlsfh\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.115807 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8363289-fd40-402e-a82d-2d7954bdca28-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.115815 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8363289-fd40-402e-a82d-2d7954bdca28-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.272786 4717 generic.go:334] "Generic 
(PLEG): container finished" podID="d8363289-fd40-402e-a82d-2d7954bdca28" containerID="e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829" exitCode=137 Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.272884 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.272874 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8363289-fd40-402e-a82d-2d7954bdca28","Type":"ContainerDied","Data":"e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829"} Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.273017 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d8363289-fd40-402e-a82d-2d7954bdca28","Type":"ContainerDied","Data":"8d76ba10e8e873bb2f265272d2312cf0927574e9f0acb746a45df6a8bcf78edd"} Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.273050 4717 scope.go:117] "RemoveContainer" containerID="e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.275849 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81029e7d-2f38-4925-a5d2-26d5dfa86fdd","Type":"ContainerStarted","Data":"7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab"} Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.276814 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.299672 4717 scope.go:117] "RemoveContainer" containerID="89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.363273 4717 scope.go:117] "RemoveContainer" containerID="e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.363753 4717 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829\": container with ID starting with e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829 not found: ID does not exist" containerID="e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.363783 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829"} err="failed to get container status \"e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829\": rpc error: code = NotFound desc = could not find container \"e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829\": container with ID starting with e5a2509363fd08a840e52d0cdc05d3e242a54087f4d0e8246ca9c120213b2829 not found: ID does not exist" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.363815 4717 scope.go:117] "RemoveContainer" containerID="89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.368359 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718\": container with ID starting with 89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718 not found: ID does not exist" containerID="89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.368398 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718"} err="failed to get container status \"89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718\": rpc error: code = NotFound 
desc = could not find container \"89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718\": container with ID starting with 89443e2e661dfca91f918a68de2a8aa98717a7a858c9a326b111773c1b645718 not found: ID does not exist" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.377199 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=10.050367023 podStartE2EDuration="13.377165814s" podCreationTimestamp="2026-02-17 15:12:49 +0000 UTC" firstStartedPulling="2026-02-17 15:12:58.260640783 +0000 UTC m=+1244.676481259" lastFinishedPulling="2026-02-17 15:13:01.587439574 +0000 UTC m=+1248.003280050" observedRunningTime="2026-02-17 15:13:02.334975997 +0000 UTC m=+1248.750816493" watchObservedRunningTime="2026-02-17 15:13:02.377165814 +0000 UTC m=+1248.793006300" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.388590 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.398687 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416251 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416630 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416647 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416661 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062a7610-89de-4af4-a35d-b965eff08320" containerName="mariadb-database-create" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416668 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="062a7610-89de-4af4-a35d-b965eff08320" containerName="mariadb-database-create" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416680 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon-log" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416686 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon-log" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416695 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36880ff-5e50-413f-8213-79ed16bed713" containerName="neutron-api" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416701 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36880ff-5e50-413f-8213-79ed16bed713" containerName="neutron-api" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416708 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7970d67a-9fb0-493c-a143-bc2fee1d4c08" containerName="mariadb-account-create-update" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416714 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7970d67a-9fb0-493c-a143-bc2fee1d4c08" containerName="mariadb-account-create-update" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416724 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f67dea-44d1-487d-9981-d758db840579" containerName="mariadb-account-create-update" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416729 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f67dea-44d1-487d-9981-d758db840579" containerName="mariadb-account-create-update" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416746 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8363289-fd40-402e-a82d-2d7954bdca28" containerName="cinder-api" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416752 4717 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d8363289-fd40-402e-a82d-2d7954bdca28" containerName="cinder-api" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416765 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02c1955-1938-4cac-b38e-e1eec7332813" containerName="mariadb-database-create" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416770 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02c1955-1938-4cac-b38e-e1eec7332813" containerName="mariadb-database-create" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416777 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36880ff-5e50-413f-8213-79ed16bed713" containerName="neutron-httpd" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416783 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36880ff-5e50-413f-8213-79ed16bed713" containerName="neutron-httpd" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416794 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd76b94-c071-4cae-8a3b-f1ef5a3ed700" containerName="mariadb-account-create-update" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416799 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd76b94-c071-4cae-8a3b-f1ef5a3ed700" containerName="mariadb-account-create-update" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416806 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47393b1e-7c8f-491e-a25a-0ef15e7eef3d" containerName="mariadb-database-create" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.416812 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="47393b1e-7c8f-491e-a25a-0ef15e7eef3d" containerName="mariadb-database-create" Feb 17 15:13:02 crc kubenswrapper[4717]: E0217 15:13:02.416824 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8363289-fd40-402e-a82d-2d7954bdca28" containerName="cinder-api-log" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 
15:13:02.416831 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8363289-fd40-402e-a82d-2d7954bdca28" containerName="cinder-api-log" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417000 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7970d67a-9fb0-493c-a143-bc2fee1d4c08" containerName="mariadb-account-create-update" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417012 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417025 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02c1955-1938-4cac-b38e-e1eec7332813" containerName="mariadb-database-create" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417032 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="946c8c31-01d1-45f7-87c2-a022100aeef9" containerName="horizon-log" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417042 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd76b94-c071-4cae-8a3b-f1ef5a3ed700" containerName="mariadb-account-create-update" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417057 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8363289-fd40-402e-a82d-2d7954bdca28" containerName="cinder-api-log" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417064 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8363289-fd40-402e-a82d-2d7954bdca28" containerName="cinder-api" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417074 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="47393b1e-7c8f-491e-a25a-0ef15e7eef3d" containerName="mariadb-database-create" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417098 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36880ff-5e50-413f-8213-79ed16bed713" containerName="neutron-api" Feb 17 
15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417109 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36880ff-5e50-413f-8213-79ed16bed713" containerName="neutron-httpd" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417119 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f67dea-44d1-487d-9981-d758db840579" containerName="mariadb-account-create-update" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.417128 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="062a7610-89de-4af4-a35d-b965eff08320" containerName="mariadb-database-create" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.418034 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.422349 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.422545 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.426595 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.436684 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.522292 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4gqz\" (UniqueName: \"kubernetes.io/projected/880f22f8-f3a5-479a-b456-7afdd5e7d96e-kube-api-access-s4gqz\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.522372 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.522406 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-config-data\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.522440 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.522481 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-config-data-custom\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.522654 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-scripts\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.522894 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/880f22f8-f3a5-479a-b456-7afdd5e7d96e-etc-machine-id\") pod \"cinder-api-0\" 
(UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.523058 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/880f22f8-f3a5-479a-b456-7afdd5e7d96e-logs\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.524542 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.626318 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4gqz\" (UniqueName: \"kubernetes.io/projected/880f22f8-f3a5-479a-b456-7afdd5e7d96e-kube-api-access-s4gqz\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.626357 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.626376 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-config-data\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.626393 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.626411 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-config-data-custom\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.626436 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-scripts\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.626483 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/880f22f8-f3a5-479a-b456-7afdd5e7d96e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.626512 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/880f22f8-f3a5-479a-b456-7afdd5e7d96e-logs\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.626531 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " 
pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.627143 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/880f22f8-f3a5-479a-b456-7afdd5e7d96e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.627564 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/880f22f8-f3a5-479a-b456-7afdd5e7d96e-logs\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.631482 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.631820 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-config-data-custom\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.632685 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.632727 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.633533 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-scripts\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.639282 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/880f22f8-f3a5-479a-b456-7afdd5e7d96e-config-data\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.643299 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4gqz\" (UniqueName: \"kubernetes.io/projected/880f22f8-f3a5-479a-b456-7afdd5e7d96e-kube-api-access-s4gqz\") pod \"cinder-api-0\" (UID: \"880f22f8-f3a5-479a-b456-7afdd5e7d96e\") " pod="openstack/cinder-api-0" Feb 17 15:13:02 crc kubenswrapper[4717]: I0217 15:13:02.736046 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 15:13:03 crc kubenswrapper[4717]: I0217 15:13:03.226768 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 15:13:03 crc kubenswrapper[4717]: I0217 15:13:03.288882 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"880f22f8-f3a5-479a-b456-7afdd5e7d96e","Type":"ContainerStarted","Data":"347a601c0221b0f041dc9d7909b253fc302cb375607fb879d859ce9e4edbc261"} Feb 17 15:13:03 crc kubenswrapper[4717]: I0217 15:13:03.874583 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8363289-fd40-402e-a82d-2d7954bdca28" path="/var/lib/kubelet/pods/d8363289-fd40-402e-a82d-2d7954bdca28/volumes" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.213837 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-24w4v"] Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.215252 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.219466 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.220294 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-47zrf" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.220611 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.225129 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-24w4v"] Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.304210 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"880f22f8-f3a5-479a-b456-7afdd5e7d96e","Type":"ContainerStarted","Data":"9c584a5aeb4dcee348d8f2eea0f33ba9d45f62800dc01f37fe888e32e42bf3c6"} Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.371521 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-scripts\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.371611 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-config-data\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.371710 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.371917 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr42p\" (UniqueName: \"kubernetes.io/projected/2692b981-8aba-4b0e-b25c-d53a5846e272-kube-api-access-lr42p\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.473920 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-scripts\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.474003 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-config-data\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.474041 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 
15:13:04.474116 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr42p\" (UniqueName: \"kubernetes.io/projected/2692b981-8aba-4b0e-b25c-d53a5846e272-kube-api-access-lr42p\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.484373 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-scripts\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.484535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.492053 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-config-data\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.496342 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr42p\" (UniqueName: \"kubernetes.io/projected/2692b981-8aba-4b0e-b25c-d53a5846e272-kube-api-access-lr42p\") pod \"nova-cell0-conductor-db-sync-24w4v\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:04 crc kubenswrapper[4717]: I0217 15:13:04.564371 
4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:05 crc kubenswrapper[4717]: W0217 15:13:05.047056 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2692b981_8aba_4b0e_b25c_d53a5846e272.slice/crio-a68c1384de7033bef50650732b1daa97e7ffedf10b6f10ed32015d813ee462ad WatchSource:0}: Error finding container a68c1384de7033bef50650732b1daa97e7ffedf10b6f10ed32015d813ee462ad: Status 404 returned error can't find the container with id a68c1384de7033bef50650732b1daa97e7ffedf10b6f10ed32015d813ee462ad Feb 17 15:13:05 crc kubenswrapper[4717]: I0217 15:13:05.050753 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-24w4v"] Feb 17 15:13:05 crc kubenswrapper[4717]: I0217 15:13:05.318273 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"880f22f8-f3a5-479a-b456-7afdd5e7d96e","Type":"ContainerStarted","Data":"cb929beded3403dae173cfd50df338b086638c6632283ec4e54326aaea932460"} Feb 17 15:13:05 crc kubenswrapper[4717]: I0217 15:13:05.318700 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 17 15:13:05 crc kubenswrapper[4717]: I0217 15:13:05.320722 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-24w4v" event={"ID":"2692b981-8aba-4b0e-b25c-d53a5846e272","Type":"ContainerStarted","Data":"a68c1384de7033bef50650732b1daa97e7ffedf10b6f10ed32015d813ee462ad"} Feb 17 15:13:05 crc kubenswrapper[4717]: I0217 15:13:05.355829 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.355802946 podStartE2EDuration="3.355802946s" podCreationTimestamp="2026-02-17 15:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-17 15:13:05.341019696 +0000 UTC m=+1251.756860182" watchObservedRunningTime="2026-02-17 15:13:05.355802946 +0000 UTC m=+1251.771643422" Feb 17 15:13:05 crc kubenswrapper[4717]: I0217 15:13:05.919780 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:05 crc kubenswrapper[4717]: I0217 15:13:05.920278 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="ceilometer-central-agent" containerID="cri-o://1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713" gracePeriod=30 Feb 17 15:13:05 crc kubenswrapper[4717]: I0217 15:13:05.920454 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="proxy-httpd" containerID="cri-o://7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab" gracePeriod=30 Feb 17 15:13:05 crc kubenswrapper[4717]: I0217 15:13:05.920557 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="ceilometer-notification-agent" containerID="cri-o://c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac" gracePeriod=30 Feb 17 15:13:05 crc kubenswrapper[4717]: I0217 15:13:05.920623 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="sg-core" containerID="cri-o://d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08" gracePeriod=30 Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.353490 4717 generic.go:334] "Generic (PLEG): container finished" podID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerID="7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab" exitCode=0 Feb 17 15:13:06 crc 
kubenswrapper[4717]: I0217 15:13:06.354732 4717 generic.go:334] "Generic (PLEG): container finished" podID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerID="d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08" exitCode=2 Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.353566 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81029e7d-2f38-4925-a5d2-26d5dfa86fdd","Type":"ContainerDied","Data":"7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab"} Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.355244 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81029e7d-2f38-4925-a5d2-26d5dfa86fdd","Type":"ContainerDied","Data":"d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08"} Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.790260 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.935132 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-sg-core-conf-yaml\") pod \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.935518 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-log-httpd\") pod \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.935578 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-run-httpd\") pod \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\" 
(UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.935623 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-combined-ca-bundle\") pod \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.935660 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-config-data\") pod \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.935735 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28xvp\" (UniqueName: \"kubernetes.io/projected/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-kube-api-access-28xvp\") pod \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.935754 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-scripts\") pod \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\" (UID: \"81029e7d-2f38-4925-a5d2-26d5dfa86fdd\") " Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.936295 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "81029e7d-2f38-4925-a5d2-26d5dfa86fdd" (UID: "81029e7d-2f38-4925-a5d2-26d5dfa86fdd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.936809 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "81029e7d-2f38-4925-a5d2-26d5dfa86fdd" (UID: "81029e7d-2f38-4925-a5d2-26d5dfa86fdd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.943336 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-kube-api-access-28xvp" (OuterVolumeSpecName: "kube-api-access-28xvp") pod "81029e7d-2f38-4925-a5d2-26d5dfa86fdd" (UID: "81029e7d-2f38-4925-a5d2-26d5dfa86fdd"). InnerVolumeSpecName "kube-api-access-28xvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.943381 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-scripts" (OuterVolumeSpecName: "scripts") pod "81029e7d-2f38-4925-a5d2-26d5dfa86fdd" (UID: "81029e7d-2f38-4925-a5d2-26d5dfa86fdd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:06 crc kubenswrapper[4717]: I0217 15:13:06.968485 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "81029e7d-2f38-4925-a5d2-26d5dfa86fdd" (UID: "81029e7d-2f38-4925-a5d2-26d5dfa86fdd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.017417 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81029e7d-2f38-4925-a5d2-26d5dfa86fdd" (UID: "81029e7d-2f38-4925-a5d2-26d5dfa86fdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.037890 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.037954 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.037976 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28xvp\" (UniqueName: \"kubernetes.io/projected/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-kube-api-access-28xvp\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.037994 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.038005 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.038046 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.054145 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-config-data" (OuterVolumeSpecName: "config-data") pod "81029e7d-2f38-4925-a5d2-26d5dfa86fdd" (UID: "81029e7d-2f38-4925-a5d2-26d5dfa86fdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.140513 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81029e7d-2f38-4925-a5d2-26d5dfa86fdd-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.372120 4717 generic.go:334] "Generic (PLEG): container finished" podID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerID="c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac" exitCode=0 Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.372160 4717 generic.go:334] "Generic (PLEG): container finished" podID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerID="1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713" exitCode=0 Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.372187 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81029e7d-2f38-4925-a5d2-26d5dfa86fdd","Type":"ContainerDied","Data":"c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac"} Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.372230 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81029e7d-2f38-4925-a5d2-26d5dfa86fdd","Type":"ContainerDied","Data":"1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713"} Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.372245 4717 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81029e7d-2f38-4925-a5d2-26d5dfa86fdd","Type":"ContainerDied","Data":"ea056569c0be616a361a60aa302566765219aa36fda6521f9c4a870f9a792edf"} Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.372267 4717 scope.go:117] "RemoveContainer" containerID="7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.372450 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.407631 4717 scope.go:117] "RemoveContainer" containerID="d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.422634 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.435102 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.447674 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:07 crc kubenswrapper[4717]: E0217 15:13:07.448141 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="proxy-httpd" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.448157 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="proxy-httpd" Feb 17 15:13:07 crc kubenswrapper[4717]: E0217 15:13:07.448172 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="ceilometer-central-agent" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.448179 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="ceilometer-central-agent" Feb 17 15:13:07 crc 
kubenswrapper[4717]: E0217 15:13:07.448205 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="ceilometer-notification-agent" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.448212 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="ceilometer-notification-agent" Feb 17 15:13:07 crc kubenswrapper[4717]: E0217 15:13:07.448224 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="sg-core" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.448232 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="sg-core" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.448400 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="ceilometer-notification-agent" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.448420 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="sg-core" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.448433 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="proxy-httpd" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.448441 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" containerName="ceilometer-central-agent" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.450489 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.455421 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.456574 4717 scope.go:117] "RemoveContainer" containerID="c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.456978 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.457228 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.496630 4717 scope.go:117] "RemoveContainer" containerID="1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.519487 4717 scope.go:117] "RemoveContainer" containerID="7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab" Feb 17 15:13:07 crc kubenswrapper[4717]: E0217 15:13:07.521236 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab\": container with ID starting with 7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab not found: ID does not exist" containerID="7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.521270 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab"} err="failed to get container status \"7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab\": rpc error: code = NotFound desc = could not find container \"7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab\": 
container with ID starting with 7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab not found: ID does not exist" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.521297 4717 scope.go:117] "RemoveContainer" containerID="d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08" Feb 17 15:13:07 crc kubenswrapper[4717]: E0217 15:13:07.521570 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08\": container with ID starting with d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08 not found: ID does not exist" containerID="d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.521589 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08"} err="failed to get container status \"d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08\": rpc error: code = NotFound desc = could not find container \"d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08\": container with ID starting with d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08 not found: ID does not exist" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.521606 4717 scope.go:117] "RemoveContainer" containerID="c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac" Feb 17 15:13:07 crc kubenswrapper[4717]: E0217 15:13:07.522026 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac\": container with ID starting with c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac not found: ID does not exist" 
containerID="c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.522362 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac"} err="failed to get container status \"c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac\": rpc error: code = NotFound desc = could not find container \"c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac\": container with ID starting with c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac not found: ID does not exist" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.522472 4717 scope.go:117] "RemoveContainer" containerID="1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713" Feb 17 15:13:07 crc kubenswrapper[4717]: E0217 15:13:07.522818 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713\": container with ID starting with 1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713 not found: ID does not exist" containerID="1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.522844 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713"} err="failed to get container status \"1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713\": rpc error: code = NotFound desc = could not find container \"1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713\": container with ID starting with 1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713 not found: ID does not exist" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.522864 4717 scope.go:117] 
"RemoveContainer" containerID="7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.523276 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab"} err="failed to get container status \"7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab\": rpc error: code = NotFound desc = could not find container \"7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab\": container with ID starting with 7b22273d051f1e3e51e3a1851a083605662073b921dd6ab81dd878f1e25b30ab not found: ID does not exist" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.523307 4717 scope.go:117] "RemoveContainer" containerID="d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.523599 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08"} err="failed to get container status \"d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08\": rpc error: code = NotFound desc = could not find container \"d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08\": container with ID starting with d16138243a610841610cb786d4b8e67571c72e5ee37914fb887907b224abdb08 not found: ID does not exist" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.523627 4717 scope.go:117] "RemoveContainer" containerID="c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.523979 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac"} err="failed to get container status \"c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac\": rpc error: code = 
NotFound desc = could not find container \"c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac\": container with ID starting with c380c0bd3a389b90efb36446e03e8053952e0f5ebcd52c18bcf28a4cd8cf0aac not found: ID does not exist" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.524002 4717 scope.go:117] "RemoveContainer" containerID="1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.524264 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713"} err="failed to get container status \"1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713\": rpc error: code = NotFound desc = could not find container \"1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713\": container with ID starting with 1f361986bfefeea8703231d7ef54e73112e58eed3a15ca59f9eb6a3e7d04c713 not found: ID does not exist" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.550379 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.550469 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.550513 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-log-httpd\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.550565 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-config-data\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.550644 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-scripts\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.550681 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj9hw\" (UniqueName: \"kubernetes.io/projected/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-kube-api-access-rj9hw\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.550714 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-run-httpd\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.653137 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj9hw\" (UniqueName: \"kubernetes.io/projected/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-kube-api-access-rj9hw\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " 
pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.653206 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-run-httpd\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.653267 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.653338 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.653387 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-log-httpd\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.653433 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-config-data\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.653504 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-scripts\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.655075 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-log-httpd\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.655373 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-run-httpd\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.657345 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-scripts\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.668512 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.669397 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-config-data\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.670658 4717 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.673529 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj9hw\" (UniqueName: \"kubernetes.io/projected/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-kube-api-access-rj9hw\") pod \"ceilometer-0\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.781427 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:13:07 crc kubenswrapper[4717]: I0217 15:13:07.873672 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81029e7d-2f38-4925-a5d2-26d5dfa86fdd" path="/var/lib/kubelet/pods/81029e7d-2f38-4925-a5d2-26d5dfa86fdd/volumes" Feb 17 15:13:08 crc kubenswrapper[4717]: I0217 15:13:08.049828 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:08 crc kubenswrapper[4717]: I0217 15:13:08.252030 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:08 crc kubenswrapper[4717]: W0217 15:13:08.265412 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c93d1cf_f6e2_4e06_8534_dca9b9e29e9d.slice/crio-9cf18eef708a0be142555e12bd12430b0a25b303670e1bc912162838e43dc881 WatchSource:0}: Error finding container 9cf18eef708a0be142555e12bd12430b0a25b303670e1bc912162838e43dc881: Status 404 returned error can't find the container with id 9cf18eef708a0be142555e12bd12430b0a25b303670e1bc912162838e43dc881 Feb 17 15:13:08 crc kubenswrapper[4717]: I0217 15:13:08.386647 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d","Type":"ContainerStarted","Data":"9cf18eef708a0be142555e12bd12430b0a25b303670e1bc912162838e43dc881"} Feb 17 15:13:09 crc kubenswrapper[4717]: I0217 15:13:09.320022 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:13:09 crc kubenswrapper[4717]: I0217 15:13:09.320272 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c7c915fe-fb2a-4325-86c0-165aa90cf60f" containerName="glance-log" containerID="cri-o://bbf2b95657c669613cc3cb5c6dfdbdd07008aa6a16efec98184347f95be3369c" gracePeriod=30 Feb 17 15:13:09 crc kubenswrapper[4717]: I0217 15:13:09.320539 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c7c915fe-fb2a-4325-86c0-165aa90cf60f" containerName="glance-httpd" containerID="cri-o://b07b381180bc0ff47bb23b19d1309dceb471c7bbc5bca8ad21e62c5b085b0021" gracePeriod=30 Feb 17 15:13:10 crc kubenswrapper[4717]: I0217 15:13:10.413724 4717 generic.go:334] "Generic (PLEG): container finished" podID="c7c915fe-fb2a-4325-86c0-165aa90cf60f" containerID="bbf2b95657c669613cc3cb5c6dfdbdd07008aa6a16efec98184347f95be3369c" exitCode=143 Feb 17 15:13:10 crc kubenswrapper[4717]: I0217 15:13:10.413831 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c915fe-fb2a-4325-86c0-165aa90cf60f","Type":"ContainerDied","Data":"bbf2b95657c669613cc3cb5c6dfdbdd07008aa6a16efec98184347f95be3369c"} Feb 17 15:13:10 crc kubenswrapper[4717]: I0217 15:13:10.613097 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:13:10 crc kubenswrapper[4717]: I0217 15:13:10.614762 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="90dd9df8-232b-4a2c-a750-4ad7209404b3" containerName="glance-log" containerID="cri-o://e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc" gracePeriod=30 Feb 17 15:13:10 crc kubenswrapper[4717]: I0217 15:13:10.614912 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="90dd9df8-232b-4a2c-a750-4ad7209404b3" containerName="glance-httpd" containerID="cri-o://842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789" gracePeriod=30 Feb 17 15:13:11 crc kubenswrapper[4717]: I0217 15:13:11.439539 4717 generic.go:334] "Generic (PLEG): container finished" podID="90dd9df8-232b-4a2c-a750-4ad7209404b3" containerID="e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc" exitCode=143 Feb 17 15:13:11 crc kubenswrapper[4717]: I0217 15:13:11.439856 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"90dd9df8-232b-4a2c-a750-4ad7209404b3","Type":"ContainerDied","Data":"e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc"} Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.476853 4717 generic.go:334] "Generic (PLEG): container finished" podID="c7c915fe-fb2a-4325-86c0-165aa90cf60f" containerID="b07b381180bc0ff47bb23b19d1309dceb471c7bbc5bca8ad21e62c5b085b0021" exitCode=0 Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.477013 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c915fe-fb2a-4325-86c0-165aa90cf60f","Type":"ContainerDied","Data":"b07b381180bc0ff47bb23b19d1309dceb471c7bbc5bca8ad21e62c5b085b0021"} Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.690777 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.821515 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.821603 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-public-tls-certs\") pod \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.821635 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb6cj\" (UniqueName: \"kubernetes.io/projected/c7c915fe-fb2a-4325-86c0-165aa90cf60f-kube-api-access-cb6cj\") pod \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.821689 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-combined-ca-bundle\") pod \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.821745 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-config-data\") pod \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.821866 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-logs\") pod \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.821986 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-scripts\") pod \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.822020 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-httpd-run\") pod \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\" (UID: \"c7c915fe-fb2a-4325-86c0-165aa90cf60f\") " Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.822854 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c7c915fe-fb2a-4325-86c0-165aa90cf60f" (UID: "c7c915fe-fb2a-4325-86c0-165aa90cf60f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.823296 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-logs" (OuterVolumeSpecName: "logs") pod "c7c915fe-fb2a-4325-86c0-165aa90cf60f" (UID: "c7c915fe-fb2a-4325-86c0-165aa90cf60f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.833723 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "c7c915fe-fb2a-4325-86c0-165aa90cf60f" (UID: "c7c915fe-fb2a-4325-86c0-165aa90cf60f"). 
InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.834054 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c915fe-fb2a-4325-86c0-165aa90cf60f-kube-api-access-cb6cj" (OuterVolumeSpecName: "kube-api-access-cb6cj") pod "c7c915fe-fb2a-4325-86c0-165aa90cf60f" (UID: "c7c915fe-fb2a-4325-86c0-165aa90cf60f"). InnerVolumeSpecName "kube-api-access-cb6cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.834069 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-scripts" (OuterVolumeSpecName: "scripts") pod "c7c915fe-fb2a-4325-86c0-165aa90cf60f" (UID: "c7c915fe-fb2a-4325-86c0-165aa90cf60f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.863072 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7c915fe-fb2a-4325-86c0-165aa90cf60f" (UID: "c7c915fe-fb2a-4325-86c0-165aa90cf60f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.907807 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c7c915fe-fb2a-4325-86c0-165aa90cf60f" (UID: "c7c915fe-fb2a-4325-86c0-165aa90cf60f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.909093 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-config-data" (OuterVolumeSpecName: "config-data") pod "c7c915fe-fb2a-4325-86c0-165aa90cf60f" (UID: "c7c915fe-fb2a-4325-86c0-165aa90cf60f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.924894 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.924929 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.924957 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.924966 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.924979 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb6cj\" (UniqueName: \"kubernetes.io/projected/c7c915fe-fb2a-4325-86c0-165aa90cf60f-kube-api-access-cb6cj\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.924988 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.924996 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c915fe-fb2a-4325-86c0-165aa90cf60f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.925004 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c915fe-fb2a-4325-86c0-165aa90cf60f-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:13 crc kubenswrapper[4717]: I0217 15:13:13.959638 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.027235 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.395900 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.502401 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-24w4v" event={"ID":"2692b981-8aba-4b0e-b25c-d53a5846e272","Type":"ContainerStarted","Data":"5f3c66ec3024fd89a219ff95045d9d9435a2abf6bfbc11ee51d5732acf582ac4"} Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.518615 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c915fe-fb2a-4325-86c0-165aa90cf60f","Type":"ContainerDied","Data":"623a425c8d49ca98e0cc6a63ae67a8dda00da468286a3c09906f11f56cdf7540"} Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.518675 4717 scope.go:117] "RemoveContainer" containerID="b07b381180bc0ff47bb23b19d1309dceb471c7bbc5bca8ad21e62c5b085b0021" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.518812 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.534990 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq9pc\" (UniqueName: \"kubernetes.io/projected/90dd9df8-232b-4a2c-a750-4ad7209404b3-kube-api-access-vq9pc\") pod \"90dd9df8-232b-4a2c-a750-4ad7209404b3\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.535045 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"90dd9df8-232b-4a2c-a750-4ad7209404b3\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.535103 4717 generic.go:334] "Generic (PLEG): container finished" podID="90dd9df8-232b-4a2c-a750-4ad7209404b3" containerID="842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789" exitCode=0 Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.535171 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"90dd9df8-232b-4a2c-a750-4ad7209404b3","Type":"ContainerDied","Data":"842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789"} Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.535212 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-scripts\") pod \"90dd9df8-232b-4a2c-a750-4ad7209404b3\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.535220 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"90dd9df8-232b-4a2c-a750-4ad7209404b3","Type":"ContainerDied","Data":"64194f8ca0f321a2172a16a43c665f7f2dfdc1c8d447788cd550fc543cfed354"} Feb 17 15:13:14 crc 
kubenswrapper[4717]: I0217 15:13:14.535288 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.535318 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-internal-tls-certs\") pod \"90dd9df8-232b-4a2c-a750-4ad7209404b3\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.535355 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-config-data\") pod \"90dd9df8-232b-4a2c-a750-4ad7209404b3\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.535399 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-combined-ca-bundle\") pod \"90dd9df8-232b-4a2c-a750-4ad7209404b3\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.535507 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-httpd-run\") pod \"90dd9df8-232b-4a2c-a750-4ad7209404b3\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.535535 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-logs\") pod \"90dd9df8-232b-4a2c-a750-4ad7209404b3\" (UID: \"90dd9df8-232b-4a2c-a750-4ad7209404b3\") " Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.536525 4717 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-logs" (OuterVolumeSpecName: "logs") pod "90dd9df8-232b-4a2c-a750-4ad7209404b3" (UID: "90dd9df8-232b-4a2c-a750-4ad7209404b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.538594 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "90dd9df8-232b-4a2c-a750-4ad7209404b3" (UID: "90dd9df8-232b-4a2c-a750-4ad7209404b3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.554450 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d","Type":"ContainerStarted","Data":"72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77"} Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.554746 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d","Type":"ContainerStarted","Data":"6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1"} Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.554884 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90dd9df8-232b-4a2c-a750-4ad7209404b3-kube-api-access-vq9pc" (OuterVolumeSpecName: "kube-api-access-vq9pc") pod "90dd9df8-232b-4a2c-a750-4ad7209404b3" (UID: "90dd9df8-232b-4a2c-a750-4ad7209404b3"). InnerVolumeSpecName "kube-api-access-vq9pc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.556206 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-scripts" (OuterVolumeSpecName: "scripts") pod "90dd9df8-232b-4a2c-a750-4ad7209404b3" (UID: "90dd9df8-232b-4a2c-a750-4ad7209404b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.574492 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "90dd9df8-232b-4a2c-a750-4ad7209404b3" (UID: "90dd9df8-232b-4a2c-a750-4ad7209404b3"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.587158 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90dd9df8-232b-4a2c-a750-4ad7209404b3" (UID: "90dd9df8-232b-4a2c-a750-4ad7209404b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.621901 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-config-data" (OuterVolumeSpecName: "config-data") pod "90dd9df8-232b-4a2c-a750-4ad7209404b3" (UID: "90dd9df8-232b-4a2c-a750-4ad7209404b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.639785 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.639838 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90dd9df8-232b-4a2c-a750-4ad7209404b3-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.639851 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq9pc\" (UniqueName: \"kubernetes.io/projected/90dd9df8-232b-4a2c-a750-4ad7209404b3-kube-api-access-vq9pc\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.639878 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.639890 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.639925 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.639935 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.664621 4717 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.682716 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "90dd9df8-232b-4a2c-a750-4ad7209404b3" (UID: "90dd9df8-232b-4a2c-a750-4ad7209404b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.690898 4717 scope.go:117] "RemoveContainer" containerID="bbf2b95657c669613cc3cb5c6dfdbdd07008aa6a16efec98184347f95be3369c" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.697191 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-24w4v" podStartSLOduration=2.368516272 podStartE2EDuration="10.697165918s" podCreationTimestamp="2026-02-17 15:13:04 +0000 UTC" firstStartedPulling="2026-02-17 15:13:05.050815081 +0000 UTC m=+1251.466655557" lastFinishedPulling="2026-02-17 15:13:13.379464727 +0000 UTC m=+1259.795305203" observedRunningTime="2026-02-17 15:13:14.531121046 +0000 UTC m=+1260.946961532" watchObservedRunningTime="2026-02-17 15:13:14.697165918 +0000 UTC m=+1261.113006384" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.700409 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.710881 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.725503 4717 scope.go:117] "RemoveContainer" containerID="842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.735336 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 17 15:13:14 crc kubenswrapper[4717]: E0217 15:13:14.735874 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c915fe-fb2a-4325-86c0-165aa90cf60f" containerName="glance-httpd" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.735895 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c915fe-fb2a-4325-86c0-165aa90cf60f" containerName="glance-httpd" Feb 17 15:13:14 crc kubenswrapper[4717]: E0217 15:13:14.735906 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90dd9df8-232b-4a2c-a750-4ad7209404b3" containerName="glance-httpd" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.735912 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="90dd9df8-232b-4a2c-a750-4ad7209404b3" containerName="glance-httpd" Feb 17 15:13:14 crc kubenswrapper[4717]: E0217 15:13:14.735924 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90dd9df8-232b-4a2c-a750-4ad7209404b3" containerName="glance-log" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.735930 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="90dd9df8-232b-4a2c-a750-4ad7209404b3" containerName="glance-log" Feb 17 15:13:14 crc kubenswrapper[4717]: E0217 15:13:14.735943 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c915fe-fb2a-4325-86c0-165aa90cf60f" containerName="glance-log" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.735950 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c915fe-fb2a-4325-86c0-165aa90cf60f" containerName="glance-log" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.736145 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c915fe-fb2a-4325-86c0-165aa90cf60f" containerName="glance-log" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.736161 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="90dd9df8-232b-4a2c-a750-4ad7209404b3" 
containerName="glance-httpd" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.736179 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c915fe-fb2a-4325-86c0-165aa90cf60f" containerName="glance-httpd" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.736192 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="90dd9df8-232b-4a2c-a750-4ad7209404b3" containerName="glance-log" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.737236 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.741456 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.741494 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90dd9df8-232b-4a2c-a750-4ad7209404b3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.742575 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.750532 4717 scope.go:117] "RemoveContainer" containerID="e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.750815 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.751037 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.789464 4717 scope.go:117] "RemoveContainer" 
containerID="842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789" Feb 17 15:13:14 crc kubenswrapper[4717]: E0217 15:13:14.790473 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789\": container with ID starting with 842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789 not found: ID does not exist" containerID="842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.790539 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789"} err="failed to get container status \"842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789\": rpc error: code = NotFound desc = could not find container \"842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789\": container with ID starting with 842d7d743588eba191958cd82adc0adfd55e85815f1a8c6414ff7d288928e789 not found: ID does not exist" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.790566 4717 scope.go:117] "RemoveContainer" containerID="e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc" Feb 17 15:13:14 crc kubenswrapper[4717]: E0217 15:13:14.791121 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc\": container with ID starting with e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc not found: ID does not exist" containerID="e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.791175 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc"} err="failed to get container status \"e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc\": rpc error: code = NotFound desc = could not find container \"e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc\": container with ID starting with e629782d2ea9565384b3c751d85518342cef238f804beef34712b3d19d53f1bc not found: ID does not exist" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.844203 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.844274 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwnvw\" (UniqueName: \"kubernetes.io/projected/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-kube-api-access-zwnvw\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.844316 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.844355 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.844378 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.844401 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.844418 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-logs\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.844486 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.876691 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.894129 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.908591 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.909970 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.912991 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.913466 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.917859 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.947441 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.947486 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-logs\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.947596 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.947655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.947689 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwnvw\" (UniqueName: \"kubernetes.io/projected/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-kube-api-access-zwnvw\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.947737 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.947797 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-config-data\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.947819 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " 
pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.948830 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.949714 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-logs\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.950006 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.952627 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.960611 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 
15:13:14.960870 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-config-data\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.961601 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.968055 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwnvw\" (UniqueName: \"kubernetes.io/projected/f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f-kube-api-access-zwnvw\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:14 crc kubenswrapper[4717]: I0217 15:13:14.993170 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f\") " pod="openstack/glance-default-external-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.050058 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.050126 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.050150 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.050194 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47f318-2779-4288-b0ce-775766436b6b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.050251 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.050291 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47f318-2779-4288-b0ce-775766436b6b-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.050778 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/cinder-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.051268 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kwfs\" (UniqueName: \"kubernetes.io/projected/cb47f318-2779-4288-b0ce-775766436b6b-kube-api-access-8kwfs\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.051312 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.066950 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.152349 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.152413 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47f318-2779-4288-b0ce-775766436b6b-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.152440 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kwfs\" (UniqueName: 
\"kubernetes.io/projected/cb47f318-2779-4288-b0ce-775766436b6b-kube-api-access-8kwfs\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.152466 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.152503 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.152532 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.152558 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.152636 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cb47f318-2779-4288-b0ce-775766436b6b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.153048 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47f318-2779-4288-b0ce-775766436b6b-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.153209 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47f318-2779-4288-b0ce-775766436b6b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.153490 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.165645 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.166706 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.167729 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.171580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kwfs\" (UniqueName: \"kubernetes.io/projected/cb47f318-2779-4288-b0ce-775766436b6b-kube-api-access-8kwfs\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.173606 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47f318-2779-4288-b0ce-775766436b6b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.181666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb47f318-2779-4288-b0ce-775766436b6b\") " pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.238449 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.577146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d","Type":"ContainerStarted","Data":"089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff"} Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.733248 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 15:13:15 crc kubenswrapper[4717]: W0217 15:13:15.734184 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf91f5d0e_cbf8_4977_b9a6_1bfffe081b2f.slice/crio-d4148a42c549daeb6a2c3029980e1b7ee64a767e8ece9b52c863b6e66292ca57 WatchSource:0}: Error finding container d4148a42c549daeb6a2c3029980e1b7ee64a767e8ece9b52c863b6e66292ca57: Status 404 returned error can't find the container with id d4148a42c549daeb6a2c3029980e1b7ee64a767e8ece9b52c863b6e66292ca57 Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.883947 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90dd9df8-232b-4a2c-a750-4ad7209404b3" path="/var/lib/kubelet/pods/90dd9df8-232b-4a2c-a750-4ad7209404b3/volumes" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.885737 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7c915fe-fb2a-4325-86c0-165aa90cf60f" path="/var/lib/kubelet/pods/c7c915fe-fb2a-4325-86c0-165aa90cf60f/volumes" Feb 17 15:13:15 crc kubenswrapper[4717]: I0217 15:13:15.886400 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 15:13:16 crc kubenswrapper[4717]: I0217 15:13:16.617370 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"cb47f318-2779-4288-b0ce-775766436b6b","Type":"ContainerStarted","Data":"239b78d1d9abb637f6b981bb5679b65bba924b9d947a7885b619f31bd263bc9d"} Feb 17 15:13:16 crc kubenswrapper[4717]: I0217 15:13:16.619837 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f","Type":"ContainerStarted","Data":"33b92dea083d79bf4634542a8c802bace0f44354c7cdb680456c22c0aeccc81d"} Feb 17 15:13:16 crc kubenswrapper[4717]: I0217 15:13:16.619860 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f","Type":"ContainerStarted","Data":"d4148a42c549daeb6a2c3029980e1b7ee64a767e8ece9b52c863b6e66292ca57"} Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.630040 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb47f318-2779-4288-b0ce-775766436b6b","Type":"ContainerStarted","Data":"3e0ff08260864aa3b7fc2f47f705ce40edd0e4f67d85409888ce92f5a2e91afa"} Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.630670 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb47f318-2779-4288-b0ce-775766436b6b","Type":"ContainerStarted","Data":"8087b33b43f8b46346dc3a68ca539c6a839bfad658379c2849b01cce39d307c2"} Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.631923 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f","Type":"ContainerStarted","Data":"a4cebb78fb5aec31e8e2d6b4508cdbd8ddc195e11389edaabdfa4ccfa166cf3e"} Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.637043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d","Type":"ContainerStarted","Data":"1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88"} Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.637214 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="ceilometer-central-agent" containerID="cri-o://6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1" gracePeriod=30 Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.637315 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.637365 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="proxy-httpd" containerID="cri-o://1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88" gracePeriod=30 Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.637414 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="sg-core" containerID="cri-o://089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff" gracePeriod=30 Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.637456 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="ceilometer-notification-agent" containerID="cri-o://72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77" gracePeriod=30 Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.660493 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.660474245 podStartE2EDuration="3.660474245s" podCreationTimestamp="2026-02-17 
15:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:13:17.656234235 +0000 UTC m=+1264.072074721" watchObservedRunningTime="2026-02-17 15:13:17.660474245 +0000 UTC m=+1264.076314721" Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.689941 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.689337494 podStartE2EDuration="3.689337494s" podCreationTimestamp="2026-02-17 15:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:13:17.676815659 +0000 UTC m=+1264.092656155" watchObservedRunningTime="2026-02-17 15:13:17.689337494 +0000 UTC m=+1264.105178010" Feb 17 15:13:17 crc kubenswrapper[4717]: I0217 15:13:17.716738 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2204603 podStartE2EDuration="10.716716081s" podCreationTimestamp="2026-02-17 15:13:07 +0000 UTC" firstStartedPulling="2026-02-17 15:13:08.267347544 +0000 UTC m=+1254.683188020" lastFinishedPulling="2026-02-17 15:13:16.763603325 +0000 UTC m=+1263.179443801" observedRunningTime="2026-02-17 15:13:17.709560208 +0000 UTC m=+1264.125400704" watchObservedRunningTime="2026-02-17 15:13:17.716716081 +0000 UTC m=+1264.132556557" Feb 17 15:13:18 crc kubenswrapper[4717]: I0217 15:13:18.650314 4717 generic.go:334] "Generic (PLEG): container finished" podID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerID="1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88" exitCode=0 Feb 17 15:13:18 crc kubenswrapper[4717]: I0217 15:13:18.650628 4717 generic.go:334] "Generic (PLEG): container finished" podID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerID="089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff" exitCode=2 Feb 17 15:13:18 
crc kubenswrapper[4717]: I0217 15:13:18.650637 4717 generic.go:334] "Generic (PLEG): container finished" podID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerID="72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77" exitCode=0 Feb 17 15:13:18 crc kubenswrapper[4717]: I0217 15:13:18.650364 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d","Type":"ContainerDied","Data":"1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88"} Feb 17 15:13:18 crc kubenswrapper[4717]: I0217 15:13:18.650707 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d","Type":"ContainerDied","Data":"089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff"} Feb 17 15:13:18 crc kubenswrapper[4717]: I0217 15:13:18.650731 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d","Type":"ContainerDied","Data":"72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77"} Feb 17 15:13:20 crc kubenswrapper[4717]: I0217 15:13:20.808877 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:13:20 crc kubenswrapper[4717]: I0217 15:13:20.809258 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:13:20 crc kubenswrapper[4717]: I0217 15:13:20.809308 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:13:20 crc kubenswrapper[4717]: I0217 15:13:20.810046 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f8e20bac773d3781e4315850afd9f8f1df648a8ef53c688d37ae5161d1be4600"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:13:20 crc kubenswrapper[4717]: I0217 15:13:20.810135 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://f8e20bac773d3781e4315850afd9f8f1df648a8ef53c688d37ae5161d1be4600" gracePeriod=600 Feb 17 15:13:21 crc kubenswrapper[4717]: I0217 15:13:21.678486 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="f8e20bac773d3781e4315850afd9f8f1df648a8ef53c688d37ae5161d1be4600" exitCode=0 Feb 17 15:13:21 crc kubenswrapper[4717]: I0217 15:13:21.678566 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"f8e20bac773d3781e4315850afd9f8f1df648a8ef53c688d37ae5161d1be4600"} Feb 17 15:13:21 crc kubenswrapper[4717]: I0217 15:13:21.678881 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"fdf1380c8cd7cb66606575dee73871bdf907cf6128e72b679677baf87d933dc1"} Feb 17 15:13:21 crc kubenswrapper[4717]: I0217 15:13:21.678903 4717 scope.go:117] "RemoveContainer" 
containerID="894debb3d49a5afafc8d152c1e296cd8509036d91968011b7ffc16cede4826fe" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.067653 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.069151 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.102602 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.127200 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.241924 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.241973 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.284633 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.298288 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.715569 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.715935 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.715950 4717 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:25 crc kubenswrapper[4717]: I0217 15:13:25.715962 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:27 crc kubenswrapper[4717]: E0217 15:13:27.124253 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c93d1cf_f6e2_4e06_8534_dca9b9e29e9d.slice/crio-6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c93d1cf_f6e2_4e06_8534_dca9b9e29e9d.slice/crio-conmon-6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1.scope\": RecentStats: unable to find data in memory cache]" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.444321 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.622695 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-combined-ca-bundle\") pod \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.622878 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj9hw\" (UniqueName: \"kubernetes.io/projected/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-kube-api-access-rj9hw\") pod \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.623016 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-run-httpd\") pod \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.623066 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-scripts\") pod \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.623152 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-log-httpd\") pod \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.623191 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-config-data\") pod \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.623218 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-sg-core-conf-yaml\") pod \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\" (UID: \"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d\") " Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.624926 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" (UID: "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.625113 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" (UID: "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.629320 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-kube-api-access-rj9hw" (OuterVolumeSpecName: "kube-api-access-rj9hw") pod "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" (UID: "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d"). InnerVolumeSpecName "kube-api-access-rj9hw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.651455 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-scripts" (OuterVolumeSpecName: "scripts") pod "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" (UID: "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.662789 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" (UID: "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.708360 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" (UID: "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.712405 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-config-data" (OuterVolumeSpecName: "config-data") pod "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" (UID: "7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.712635 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.713538 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.727305 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.727383 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj9hw\" (UniqueName: \"kubernetes.io/projected/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-kube-api-access-rj9hw\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.727400 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.727410 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.727420 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.727428 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-config-data\") on node \"crc\" DevicePath \"\"" 
Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.727437 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.792893 4717 generic.go:334] "Generic (PLEG): container finished" podID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerID="6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1" exitCode=0 Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.795053 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.795073 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.795030 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d","Type":"ContainerDied","Data":"6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1"} Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.795231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d","Type":"ContainerDied","Data":"9cf18eef708a0be142555e12bd12430b0a25b303670e1bc912162838e43dc881"} Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.795273 4717 scope.go:117] "RemoveContainer" containerID="1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.795164 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.808383 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.809735 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.836772 4717 scope.go:117] "RemoveContainer" containerID="089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.877473 4717 scope.go:117] "RemoveContainer" containerID="72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.901387 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.901421 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.916585 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:27 crc kubenswrapper[4717]: E0217 15:13:27.917115 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="proxy-httpd" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.917132 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="proxy-httpd" Feb 17 15:13:27 crc kubenswrapper[4717]: E0217 15:13:27.917161 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="ceilometer-central-agent" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.917169 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" 
containerName="ceilometer-central-agent" Feb 17 15:13:27 crc kubenswrapper[4717]: E0217 15:13:27.917180 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="sg-core" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.917188 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="sg-core" Feb 17 15:13:27 crc kubenswrapper[4717]: E0217 15:13:27.917204 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="ceilometer-notification-agent" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.917213 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="ceilometer-notification-agent" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.917419 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="sg-core" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.917437 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="ceilometer-central-agent" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.917450 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="proxy-httpd" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.917468 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" containerName="ceilometer-notification-agent" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.919982 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.931536 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.936873 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.949271 4717 scope.go:117] "RemoveContainer" containerID="6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1" Feb 17 15:13:27 crc kubenswrapper[4717]: I0217 15:13:27.949861 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.049278 4717 scope.go:117] "RemoveContainer" containerID="1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88" Feb 17 15:13:28 crc kubenswrapper[4717]: E0217 15:13:28.061247 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88\": container with ID starting with 1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88 not found: ID does not exist" containerID="1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.061296 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88"} err="failed to get container status \"1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88\": rpc error: code = NotFound desc = could not find container \"1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88\": container with ID starting with 1f0059f93f085d72128d05b33eec68169a0a1904b9706a0f1b5ecc83daaf7f88 not found: ID does not exist" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 
15:13:28.061331 4717 scope.go:117] "RemoveContainer" containerID="089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff" Feb 17 15:13:28 crc kubenswrapper[4717]: E0217 15:13:28.065258 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff\": container with ID starting with 089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff not found: ID does not exist" containerID="089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.065292 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff"} err="failed to get container status \"089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff\": rpc error: code = NotFound desc = could not find container \"089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff\": container with ID starting with 089e926390b006c5efc7810ddeb093e253227dd03dbd41c7d2a5c1b8ebdd58ff not found: ID does not exist" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.065335 4717 scope.go:117] "RemoveContainer" containerID="72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.065393 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.065447 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-scripts\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.065504 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-config-data\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.065531 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.065573 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-run-httpd\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.065616 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-log-httpd\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.065651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct274\" (UniqueName: \"kubernetes.io/projected/7cb760d7-52d5-42d6-9139-f87799aa88b9-kube-api-access-ct274\") pod \"ceilometer-0\" (UID: 
\"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: E0217 15:13:28.067651 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77\": container with ID starting with 72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77 not found: ID does not exist" containerID="72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.067677 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77"} err="failed to get container status \"72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77\": rpc error: code = NotFound desc = could not find container \"72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77\": container with ID starting with 72db533c7e8d95abbc8ffb5c8912bdf57d3b61387476d9d65fa8764e48702f77 not found: ID does not exist" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.067700 4717 scope.go:117] "RemoveContainer" containerID="6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1" Feb 17 15:13:28 crc kubenswrapper[4717]: E0217 15:13:28.072182 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1\": container with ID starting with 6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1 not found: ID does not exist" containerID="6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.072210 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1"} err="failed to get container status \"6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1\": rpc error: code = NotFound desc = could not find container \"6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1\": container with ID starting with 6e3dd124225ce062cbac1ae428bb2432d484af22975dc7e6663bb54c0d629fc1 not found: ID does not exist" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.167920 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-config-data\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.167993 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.168058 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-run-httpd\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.168138 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-log-httpd\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.168185 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ct274\" (UniqueName: \"kubernetes.io/projected/7cb760d7-52d5-42d6-9139-f87799aa88b9-kube-api-access-ct274\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.168241 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.168274 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-scripts\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.169377 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-run-httpd\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.171376 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-log-httpd\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.177182 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-config-data\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: 
I0217 15:13:28.184595 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-scripts\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.185239 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.207860 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.216790 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct274\" (UniqueName: \"kubernetes.io/projected/7cb760d7-52d5-42d6-9139-f87799aa88b9-kube-api-access-ct274\") pod \"ceilometer-0\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.331512 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.805112 4717 generic.go:334] "Generic (PLEG): container finished" podID="2692b981-8aba-4b0e-b25c-d53a5846e272" containerID="5f3c66ec3024fd89a219ff95045d9d9435a2abf6bfbc11ee51d5732acf582ac4" exitCode=0 Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.805182 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-24w4v" event={"ID":"2692b981-8aba-4b0e-b25c-d53a5846e272","Type":"ContainerDied","Data":"5f3c66ec3024fd89a219ff95045d9d9435a2abf6bfbc11ee51d5732acf582ac4"} Feb 17 15:13:28 crc kubenswrapper[4717]: W0217 15:13:28.822182 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cb760d7_52d5_42d6_9139_f87799aa88b9.slice/crio-ac459812fa123f127ba2b13703eafee5469f551351db00806adf895971e35456 WatchSource:0}: Error finding container ac459812fa123f127ba2b13703eafee5469f551351db00806adf895971e35456: Status 404 returned error can't find the container with id ac459812fa123f127ba2b13703eafee5469f551351db00806adf895971e35456 Feb 17 15:13:28 crc kubenswrapper[4717]: I0217 15:13:28.828890 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:13:29 crc kubenswrapper[4717]: I0217 15:13:29.825546 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cb760d7-52d5-42d6-9139-f87799aa88b9","Type":"ContainerStarted","Data":"1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be"} Feb 17 15:13:29 crc kubenswrapper[4717]: I0217 15:13:29.825918 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cb760d7-52d5-42d6-9139-f87799aa88b9","Type":"ContainerStarted","Data":"ac459812fa123f127ba2b13703eafee5469f551351db00806adf895971e35456"} Feb 17 15:13:29 crc kubenswrapper[4717]: I0217 15:13:29.861003 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d" path="/var/lib/kubelet/pods/7c93d1cf-f6e2-4e06-8534-dca9b9e29e9d/volumes" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.359872 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.543807 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-combined-ca-bundle\") pod \"2692b981-8aba-4b0e-b25c-d53a5846e272\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.544132 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-scripts\") pod \"2692b981-8aba-4b0e-b25c-d53a5846e272\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.544214 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr42p\" (UniqueName: \"kubernetes.io/projected/2692b981-8aba-4b0e-b25c-d53a5846e272-kube-api-access-lr42p\") pod \"2692b981-8aba-4b0e-b25c-d53a5846e272\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.544268 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-config-data\") pod \"2692b981-8aba-4b0e-b25c-d53a5846e272\" (UID: \"2692b981-8aba-4b0e-b25c-d53a5846e272\") " Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.548941 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2692b981-8aba-4b0e-b25c-d53a5846e272-kube-api-access-lr42p" (OuterVolumeSpecName: "kube-api-access-lr42p") pod "2692b981-8aba-4b0e-b25c-d53a5846e272" (UID: "2692b981-8aba-4b0e-b25c-d53a5846e272"). InnerVolumeSpecName "kube-api-access-lr42p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.549073 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-scripts" (OuterVolumeSpecName: "scripts") pod "2692b981-8aba-4b0e-b25c-d53a5846e272" (UID: "2692b981-8aba-4b0e-b25c-d53a5846e272"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.584329 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-config-data" (OuterVolumeSpecName: "config-data") pod "2692b981-8aba-4b0e-b25c-d53a5846e272" (UID: "2692b981-8aba-4b0e-b25c-d53a5846e272"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.597345 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2692b981-8aba-4b0e-b25c-d53a5846e272" (UID: "2692b981-8aba-4b0e-b25c-d53a5846e272"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.646443 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.646771 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr42p\" (UniqueName: \"kubernetes.io/projected/2692b981-8aba-4b0e-b25c-d53a5846e272-kube-api-access-lr42p\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.646810 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.646820 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2692b981-8aba-4b0e-b25c-d53a5846e272-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.834451 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-24w4v" event={"ID":"2692b981-8aba-4b0e-b25c-d53a5846e272","Type":"ContainerDied","Data":"a68c1384de7033bef50650732b1daa97e7ffedf10b6f10ed32015d813ee462ad"} Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.834488 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a68c1384de7033bef50650732b1daa97e7ffedf10b6f10ed32015d813ee462ad" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.834536 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-24w4v" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.840678 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cb760d7-52d5-42d6-9139-f87799aa88b9","Type":"ContainerStarted","Data":"4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6"} Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.948153 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 15:13:30 crc kubenswrapper[4717]: E0217 15:13:30.956848 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2692b981-8aba-4b0e-b25c-d53a5846e272" containerName="nova-cell0-conductor-db-sync" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.956878 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2692b981-8aba-4b0e-b25c-d53a5846e272" containerName="nova-cell0-conductor-db-sync" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.957233 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2692b981-8aba-4b0e-b25c-d53a5846e272" containerName="nova-cell0-conductor-db-sync" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.958038 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.961614 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.961696 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 15:13:30 crc kubenswrapper[4717]: I0217 15:13:30.965907 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-47zrf" Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.055668 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f24db6-956b-479c-88b5-283ae2b17f4d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"18f24db6-956b-479c-88b5-283ae2b17f4d\") " pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.056127 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ljpp\" (UniqueName: \"kubernetes.io/projected/18f24db6-956b-479c-88b5-283ae2b17f4d-kube-api-access-4ljpp\") pod \"nova-cell0-conductor-0\" (UID: \"18f24db6-956b-479c-88b5-283ae2b17f4d\") " pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.056206 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f24db6-956b-479c-88b5-283ae2b17f4d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"18f24db6-956b-479c-88b5-283ae2b17f4d\") " pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.158546 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/18f24db6-956b-479c-88b5-283ae2b17f4d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"18f24db6-956b-479c-88b5-283ae2b17f4d\") " pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.158620 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ljpp\" (UniqueName: \"kubernetes.io/projected/18f24db6-956b-479c-88b5-283ae2b17f4d-kube-api-access-4ljpp\") pod \"nova-cell0-conductor-0\" (UID: \"18f24db6-956b-479c-88b5-283ae2b17f4d\") " pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.158682 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f24db6-956b-479c-88b5-283ae2b17f4d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"18f24db6-956b-479c-88b5-283ae2b17f4d\") " pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.164914 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f24db6-956b-479c-88b5-283ae2b17f4d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"18f24db6-956b-479c-88b5-283ae2b17f4d\") " pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.165044 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f24db6-956b-479c-88b5-283ae2b17f4d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"18f24db6-956b-479c-88b5-283ae2b17f4d\") " pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.174046 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ljpp\" (UniqueName: \"kubernetes.io/projected/18f24db6-956b-479c-88b5-283ae2b17f4d-kube-api-access-4ljpp\") pod \"nova-cell0-conductor-0\" (UID: 
\"18f24db6-956b-479c-88b5-283ae2b17f4d\") " pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.285071 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.826404 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.856641 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"18f24db6-956b-479c-88b5-283ae2b17f4d","Type":"ContainerStarted","Data":"4adeac875a9c050125a0aad8cf1ae88112fa3fcfae25e5e19730717fc3d876e2"} Feb 17 15:13:31 crc kubenswrapper[4717]: I0217 15:13:31.858610 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cb760d7-52d5-42d6-9139-f87799aa88b9","Type":"ContainerStarted","Data":"43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0"} Feb 17 15:13:32 crc kubenswrapper[4717]: I0217 15:13:32.867955 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cb760d7-52d5-42d6-9139-f87799aa88b9","Type":"ContainerStarted","Data":"13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb"} Feb 17 15:13:32 crc kubenswrapper[4717]: I0217 15:13:32.868592 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 15:13:32 crc kubenswrapper[4717]: I0217 15:13:32.870003 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"18f24db6-956b-479c-88b5-283ae2b17f4d","Type":"ContainerStarted","Data":"2c7f4e832bf4b5151c4638480610b48162bcc4f3a5db4e3feb5009c92eccbdda"} Feb 17 15:13:32 crc kubenswrapper[4717]: I0217 15:13:32.870784 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:32 crc 
kubenswrapper[4717]: I0217 15:13:32.903272 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.329000841 podStartE2EDuration="5.903251574s" podCreationTimestamp="2026-02-17 15:13:27 +0000 UTC" firstStartedPulling="2026-02-17 15:13:28.824865547 +0000 UTC m=+1275.240706023" lastFinishedPulling="2026-02-17 15:13:32.39911628 +0000 UTC m=+1278.814956756" observedRunningTime="2026-02-17 15:13:32.900875317 +0000 UTC m=+1279.316715813" watchObservedRunningTime="2026-02-17 15:13:32.903251574 +0000 UTC m=+1279.319092050" Feb 17 15:13:32 crc kubenswrapper[4717]: I0217 15:13:32.923450 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.923429397 podStartE2EDuration="2.923429397s" podCreationTimestamp="2026-02-17 15:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:13:32.914206505 +0000 UTC m=+1279.330046971" watchObservedRunningTime="2026-02-17 15:13:32.923429397 +0000 UTC m=+1279.339269873" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.330146 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.779935 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4q8gf"] Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.781480 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.789020 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.790229 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.796220 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4q8gf"] Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.892195 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-scripts\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.892319 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf4gl\" (UniqueName: \"kubernetes.io/projected/d9ed346a-8b4f-464f-8035-73f75ad5e83f-kube-api-access-vf4gl\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.892343 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.892383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-config-data\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.979030 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.980059 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.987465 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.994432 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf4gl\" (UniqueName: \"kubernetes.io/projected/d9ed346a-8b4f-464f-8035-73f75ad5e83f-kube-api-access-vf4gl\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.994476 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.994563 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-config-data\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:36 crc kubenswrapper[4717]: I0217 15:13:36.994678 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-scripts\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.003743 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.004260 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-config-data\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.006453 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-scripts\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.009359 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.024947 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.026733 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.032451 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.050152 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf4gl\" (UniqueName: \"kubernetes.io/projected/d9ed346a-8b4f-464f-8035-73f75ad5e83f-kube-api-access-vf4gl\") pod \"nova-cell0-cell-mapping-4q8gf\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.076142 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.096129 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkz8f\" (UniqueName: \"kubernetes.io/projected/e21a925c-a968-4b57-8e0c-941b4a403a2b-kube-api-access-tkz8f\") pod \"nova-cell1-novncproxy-0\" (UID: \"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.096221 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.096313 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:37 crc 
kubenswrapper[4717]: I0217 15:13:37.109710 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.155267 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-gcs9s"] Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.191754 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.260207 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-gcs9s"] Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.313950 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-config-data\") pod \"nova-metadata-0\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.314057 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.314100 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.314338 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b78wv\" 
(UniqueName: \"kubernetes.io/projected/882f425e-0f60-444b-af1a-688a91dcbd36-kube-api-access-b78wv\") pod \"nova-metadata-0\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.314613 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.315217 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkz8f\" (UniqueName: \"kubernetes.io/projected/e21a925c-a968-4b57-8e0c-941b4a403a2b-kube-api-access-tkz8f\") pod \"nova-cell1-novncproxy-0\" (UID: \"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.315308 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/882f425e-0f60-444b-af1a-688a91dcbd36-logs\") pod \"nova-metadata-0\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.328871 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.353073 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.354893 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkz8f\" (UniqueName: \"kubernetes.io/projected/e21a925c-a968-4b57-8e0c-941b4a403a2b-kube-api-access-tkz8f\") pod \"nova-cell1-novncproxy-0\" (UID: \"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.357509 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.359995 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.363493 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.388150 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.402331 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.411544 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.417124 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419049 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419389 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-config-data\") pod \"nova-metadata-0\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419441 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419471 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419488 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cldb\" (UniqueName: \"kubernetes.io/projected/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-kube-api-access-5cldb\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419507 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-ghqtr\" (UniqueName: \"kubernetes.io/projected/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-kube-api-access-ghqtr\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419532 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419555 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-config-data\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419595 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-config\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419614 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419628 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419643 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-logs\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419676 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b78wv\" (UniqueName: \"kubernetes.io/projected/882f425e-0f60-444b-af1a-688a91dcbd36-kube-api-access-b78wv\") pod \"nova-metadata-0\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419739 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.419758 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/882f425e-0f60-444b-af1a-688a91dcbd36-logs\") pod \"nova-metadata-0\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.420199 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/882f425e-0f60-444b-af1a-688a91dcbd36-logs\") pod \"nova-metadata-0\" (UID: 
\"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.424434 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-config-data\") pod \"nova-metadata-0\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.425277 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.451702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b78wv\" (UniqueName: \"kubernetes.io/projected/882f425e-0f60-444b-af1a-688a91dcbd36-kube-api-access-b78wv\") pod \"nova-metadata-0\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") " pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.458125 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.473756 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.525195 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cldb\" (UniqueName: \"kubernetes.io/projected/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-kube-api-access-5cldb\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.525323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghqtr\" (UniqueName: \"kubernetes.io/projected/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-kube-api-access-ghqtr\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.525397 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk5x2\" (UniqueName: \"kubernetes.io/projected/6cb29e62-e185-4ccf-beb0-763a176ae07f-kube-api-access-sk5x2\") pod \"nova-scheduler-0\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " pod="openstack/nova-scheduler-0" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.525459 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.525558 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-config-data\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0" Feb 17 15:13:37 crc kubenswrapper[4717]: 
I0217 15:13:37.525607 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " pod="openstack/nova-scheduler-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.525706 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-config\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.525754 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.525773 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.525796 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-logs\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.526063 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.526147 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-config-data\") pod \"nova-scheduler-0\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " pod="openstack/nova-scheduler-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.526244 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.526427 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.527090 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-logs\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.527387 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-config\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.527711 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.527789 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.528861 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.540716 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.542660 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-config-data\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.548164 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghqtr\" (UniqueName: \"kubernetes.io/projected/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-kube-api-access-ghqtr\") pod \"dnsmasq-dns-845d6d6f59-gcs9s\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.554347 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cldb\" (UniqueName: \"kubernetes.io/projected/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-kube-api-access-5cldb\") pod \"nova-api-0\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " pod="openstack/nova-api-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.584888 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.589258 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.627201 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-config-data\") pod \"nova-scheduler-0\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " pod="openstack/nova-scheduler-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.627364 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk5x2\" (UniqueName: \"kubernetes.io/projected/6cb29e62-e185-4ccf-beb0-763a176ae07f-kube-api-access-sk5x2\") pod \"nova-scheduler-0\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " pod="openstack/nova-scheduler-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.627456 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " pod="openstack/nova-scheduler-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.648435 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " pod="openstack/nova-scheduler-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.653254 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-config-data\") pod \"nova-scheduler-0\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " pod="openstack/nova-scheduler-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.657914 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk5x2\" (UniqueName: \"kubernetes.io/projected/6cb29e62-e185-4ccf-beb0-763a176ae07f-kube-api-access-sk5x2\") pod \"nova-scheduler-0\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " pod="openstack/nova-scheduler-0"
Feb 17 15:13:37 crc kubenswrapper[4717]: I0217 15:13:37.863244 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.190866 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-68ckm"]
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.192744 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.197583 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.197782 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.220817 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4q8gf"]
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.232322 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-68ckm"]
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.357526 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-config-data\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.357654 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.357688 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-scripts\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.357804 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdht9\" (UniqueName: \"kubernetes.io/projected/55362b6c-9d2e-4df5-887d-3955d617c166-kube-api-access-vdht9\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.430430 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.462416 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-config-data\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.462510 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.462530 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-scripts\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.462598 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdht9\" (UniqueName: \"kubernetes.io/projected/55362b6c-9d2e-4df5-887d-3955d617c166-kube-api-access-vdht9\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.485965 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.557426 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.558587 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-config-data\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.561517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdht9\" (UniqueName: \"kubernetes.io/projected/55362b6c-9d2e-4df5-887d-3955d617c166-kube-api-access-vdht9\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.567745 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-scripts\") pod \"nova-cell1-conductor-db-sync-68ckm\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") " pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.675763 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.682974 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-gcs9s"]
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.704180 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 17 15:13:38 crc kubenswrapper[4717]: W0217 15:13:38.743394 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod336126bd_b5e1_4870_8a9f_9d2b4a5c7e41.slice/crio-e340b789a9cbe0e9b6587c69dffeb1a6b1d8f0d4e11dd218736e35a11f56f5eb WatchSource:0}: Error finding container e340b789a9cbe0e9b6587c69dffeb1a6b1d8f0d4e11dd218736e35a11f56f5eb: Status 404 returned error can't find the container with id e340b789a9cbe0e9b6587c69dffeb1a6b1d8f0d4e11dd218736e35a11f56f5eb
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.950720 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.988135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41","Type":"ContainerStarted","Data":"e340b789a9cbe0e9b6587c69dffeb1a6b1d8f0d4e11dd218736e35a11f56f5eb"}
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.994159 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6cb29e62-e185-4ccf-beb0-763a176ae07f","Type":"ContainerStarted","Data":"35babb2472901e195c66cb5c8c4444747dc2fc7f8fdad4a3a4f1c5211459c427"}
Feb 17 15:13:38 crc kubenswrapper[4717]: I0217 15:13:38.995434 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e21a925c-a968-4b57-8e0c-941b4a403a2b","Type":"ContainerStarted","Data":"c824311c97c6194f6d2e114954006d7c2234e7f8626f4a4a4f87863d32fa4cd8"}
Feb 17 15:13:39 crc kubenswrapper[4717]: I0217 15:13:39.017364 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" event={"ID":"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5","Type":"ContainerStarted","Data":"04ccbf748503fe77571c38f8134237bee9e0f2905a43e1affebc3d74fbd01408"}
Feb 17 15:13:39 crc kubenswrapper[4717]: I0217 15:13:39.021649 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"882f425e-0f60-444b-af1a-688a91dcbd36","Type":"ContainerStarted","Data":"63f09265e6385fc77feaf1691afa847752066c48d7099894825dc3fd12c67a35"}
Feb 17 15:13:39 crc kubenswrapper[4717]: I0217 15:13:39.023035 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4q8gf" event={"ID":"d9ed346a-8b4f-464f-8035-73f75ad5e83f","Type":"ContainerStarted","Data":"8448081ee7799d961d0b56d916c4f02569f02a25b1507c1aacfd35e596dee9d6"}
Feb 17 15:13:39 crc kubenswrapper[4717]: I0217 15:13:39.023059 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4q8gf" event={"ID":"d9ed346a-8b4f-464f-8035-73f75ad5e83f","Type":"ContainerStarted","Data":"1c6106f1d6b3106fd2125e890c18b8e34830e1aa3252cc1e90601cd42a4a12ec"}
Feb 17 15:13:39 crc kubenswrapper[4717]: I0217 15:13:39.047483 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4q8gf" podStartSLOduration=3.047465064 podStartE2EDuration="3.047465064s" podCreationTimestamp="2026-02-17 15:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:13:39.045450566 +0000 UTC m=+1285.461291062" watchObservedRunningTime="2026-02-17 15:13:39.047465064 +0000 UTC m=+1285.463305540"
Feb 17 15:13:39 crc kubenswrapper[4717]: I0217 15:13:39.178572 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-68ckm"]
Feb 17 15:13:40 crc kubenswrapper[4717]: I0217 15:13:40.035335 4717 generic.go:334] "Generic (PLEG): container finished" podID="8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" containerID="635011d10084309b67449de60d2f160275fd59df48d3ec1d9679356789362eac" exitCode=0
Feb 17 15:13:40 crc kubenswrapper[4717]: I0217 15:13:40.035543 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" event={"ID":"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5","Type":"ContainerDied","Data":"635011d10084309b67449de60d2f160275fd59df48d3ec1d9679356789362eac"}
Feb 17 15:13:40 crc kubenswrapper[4717]: I0217 15:13:40.038111 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-68ckm" event={"ID":"55362b6c-9d2e-4df5-887d-3955d617c166","Type":"ContainerStarted","Data":"ef9639468077aac0cb62b975f12d75e6bc1ad3fa6c8c6e7a33b6a043a4a20a09"}
Feb 17 15:13:40 crc kubenswrapper[4717]: I0217 15:13:40.038165 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-68ckm" event={"ID":"55362b6c-9d2e-4df5-887d-3955d617c166","Type":"ContainerStarted","Data":"23d775f20ea0aa0950f0825b2ec3adedf50dadf63c5c6914374d872619abfc05"}
Feb 17 15:13:40 crc kubenswrapper[4717]: I0217 15:13:40.075027 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-68ckm" podStartSLOduration=2.07500767 podStartE2EDuration="2.07500767s" podCreationTimestamp="2026-02-17 15:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:13:40.06899263 +0000 UTC m=+1286.484833106" watchObservedRunningTime="2026-02-17 15:13:40.07500767 +0000 UTC m=+1286.490848146"
Feb 17 15:13:40 crc kubenswrapper[4717]: I0217 15:13:40.555318 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 15:13:40 crc kubenswrapper[4717]: I0217 15:13:40.581479 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 15:13:43 crc kubenswrapper[4717]: I0217 15:13:43.083458 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41","Type":"ContainerStarted","Data":"4a8943bb8a1f1503ea2d27b3cda586d2987796069eaec1f34a9c5b5b471b87c8"}
Feb 17 15:13:43 crc kubenswrapper[4717]: I0217 15:13:43.089340 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6cb29e62-e185-4ccf-beb0-763a176ae07f","Type":"ContainerStarted","Data":"5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e"}
Feb 17 15:13:43 crc kubenswrapper[4717]: I0217 15:13:43.091870 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e21a925c-a968-4b57-8e0c-941b4a403a2b","Type":"ContainerStarted","Data":"75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930"}
Feb 17 15:13:43 crc kubenswrapper[4717]: I0217 15:13:43.092036 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e21a925c-a968-4b57-8e0c-941b4a403a2b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930" gracePeriod=30
Feb 17 15:13:43 crc kubenswrapper[4717]: I0217 15:13:43.096201 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" event={"ID":"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5","Type":"ContainerStarted","Data":"78b83d7a05fc7491d26d0f183c14475d542e4f23c36f1d2b6f518e567e67f749"}
Feb 17 15:13:43 crc kubenswrapper[4717]: I0217 15:13:43.096298 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s"
Feb 17 15:13:43 crc kubenswrapper[4717]: I0217 15:13:43.099664 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"882f425e-0f60-444b-af1a-688a91dcbd36","Type":"ContainerStarted","Data":"9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0"}
Feb 17 15:13:43 crc kubenswrapper[4717]: I0217 15:13:43.119714 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.526152677 podStartE2EDuration="6.119694958s" podCreationTimestamp="2026-02-17 15:13:37 +0000 UTC" firstStartedPulling="2026-02-17 15:13:38.952472818 +0000 UTC m=+1285.368313294" lastFinishedPulling="2026-02-17 15:13:42.546015099 +0000 UTC m=+1288.961855575" observedRunningTime="2026-02-17 15:13:43.105555557 +0000 UTC m=+1289.521396043" watchObservedRunningTime="2026-02-17 15:13:43.119694958 +0000 UTC m=+1289.535535434"
Feb 17 15:13:43 crc kubenswrapper[4717]: I0217 15:13:43.130330 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" podStartSLOduration=6.130311679 podStartE2EDuration="6.130311679s" podCreationTimestamp="2026-02-17 15:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:13:43.125025579 +0000 UTC m=+1289.540866065" watchObservedRunningTime="2026-02-17 15:13:43.130311679 +0000 UTC m=+1289.546152155"
Feb 17 15:13:43 crc kubenswrapper[4717]: I0217 15:13:43.155026 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.186681784 podStartE2EDuration="7.155000629s" podCreationTimestamp="2026-02-17 15:13:36 +0000 UTC" firstStartedPulling="2026-02-17 15:13:38.579863685 +0000 UTC m=+1284.995704161" lastFinishedPulling="2026-02-17 15:13:42.54818253 +0000 UTC m=+1288.964023006" observedRunningTime="2026-02-17 15:13:43.14126713 +0000 UTC m=+1289.557107606" watchObservedRunningTime="2026-02-17 15:13:43.155000629 +0000 UTC m=+1289.570841105"
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.111313 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41","Type":"ContainerStarted","Data":"671bfb3d4d4694d49a4c5cb5dd81ef5546c7bb3d76120ae971358f1d9eb14908"}
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.113461 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"882f425e-0f60-444b-af1a-688a91dcbd36","Type":"ContainerStarted","Data":"6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901"}
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.113834 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="882f425e-0f60-444b-af1a-688a91dcbd36" containerName="nova-metadata-log" containerID="cri-o://9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0" gracePeriod=30
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.113892 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="882f425e-0f60-444b-af1a-688a91dcbd36" containerName="nova-metadata-metadata" containerID="cri-o://6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901" gracePeriod=30
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.129636 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.332668003 podStartE2EDuration="7.129613635s" podCreationTimestamp="2026-02-17 15:13:37 +0000 UTC" firstStartedPulling="2026-02-17 15:13:38.747007298 +0000 UTC m=+1285.162847764" lastFinishedPulling="2026-02-17 15:13:42.54395293 +0000 UTC m=+1288.959793396" observedRunningTime="2026-02-17 15:13:44.128284447 +0000 UTC m=+1290.544124933" watchObservedRunningTime="2026-02-17 15:13:44.129613635 +0000 UTC m=+1290.545454111"
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.163430 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.199915146 podStartE2EDuration="8.163410044s" podCreationTimestamp="2026-02-17 15:13:36 +0000 UTC" firstStartedPulling="2026-02-17 15:13:38.579612738 +0000 UTC m=+1284.995453214" lastFinishedPulling="2026-02-17 15:13:42.543107636 +0000 UTC m=+1288.958948112" observedRunningTime="2026-02-17 15:13:44.14636865 +0000 UTC m=+1290.562209146" watchObservedRunningTime="2026-02-17 15:13:44.163410044 +0000 UTC m=+1290.579250520"
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.803838 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.922990 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-config-data\") pod \"882f425e-0f60-444b-af1a-688a91dcbd36\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") "
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.923087 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-combined-ca-bundle\") pod \"882f425e-0f60-444b-af1a-688a91dcbd36\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") "
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.923152 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/882f425e-0f60-444b-af1a-688a91dcbd36-logs\") pod \"882f425e-0f60-444b-af1a-688a91dcbd36\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") "
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.923231 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b78wv\" (UniqueName: \"kubernetes.io/projected/882f425e-0f60-444b-af1a-688a91dcbd36-kube-api-access-b78wv\") pod \"882f425e-0f60-444b-af1a-688a91dcbd36\" (UID: \"882f425e-0f60-444b-af1a-688a91dcbd36\") "
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.925049 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/882f425e-0f60-444b-af1a-688a91dcbd36-logs" (OuterVolumeSpecName: "logs") pod "882f425e-0f60-444b-af1a-688a91dcbd36" (UID: "882f425e-0f60-444b-af1a-688a91dcbd36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.934448 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/882f425e-0f60-444b-af1a-688a91dcbd36-kube-api-access-b78wv" (OuterVolumeSpecName: "kube-api-access-b78wv") pod "882f425e-0f60-444b-af1a-688a91dcbd36" (UID: "882f425e-0f60-444b-af1a-688a91dcbd36"). InnerVolumeSpecName "kube-api-access-b78wv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.952916 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "882f425e-0f60-444b-af1a-688a91dcbd36" (UID: "882f425e-0f60-444b-af1a-688a91dcbd36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:13:44 crc kubenswrapper[4717]: I0217 15:13:44.963191 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-config-data" (OuterVolumeSpecName: "config-data") pod "882f425e-0f60-444b-af1a-688a91dcbd36" (UID: "882f425e-0f60-444b-af1a-688a91dcbd36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.026071 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.026199 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/882f425e-0f60-444b-af1a-688a91dcbd36-logs\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.026220 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b78wv\" (UniqueName: \"kubernetes.io/projected/882f425e-0f60-444b-af1a-688a91dcbd36-kube-api-access-b78wv\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.026244 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882f425e-0f60-444b-af1a-688a91dcbd36-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.124466 4717 generic.go:334] "Generic (PLEG): container finished" podID="882f425e-0f60-444b-af1a-688a91dcbd36" containerID="6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901" exitCode=0
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.124494 4717 generic.go:334] "Generic (PLEG): container finished" podID="882f425e-0f60-444b-af1a-688a91dcbd36" containerID="9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0" exitCode=143
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.124537 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.124525 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"882f425e-0f60-444b-af1a-688a91dcbd36","Type":"ContainerDied","Data":"6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901"}
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.124665 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"882f425e-0f60-444b-af1a-688a91dcbd36","Type":"ContainerDied","Data":"9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0"}
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.124685 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"882f425e-0f60-444b-af1a-688a91dcbd36","Type":"ContainerDied","Data":"63f09265e6385fc77feaf1691afa847752066c48d7099894825dc3fd12c67a35"}
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.124707 4717 scope.go:117] "RemoveContainer" containerID="6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.156386 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.157652 4717 scope.go:117] "RemoveContainer" containerID="9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.176623 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.204025 4717 scope.go:117] "RemoveContainer" containerID="6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901"
Feb 17 15:13:45 crc kubenswrapper[4717]: E0217 15:13:45.204680 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901\": container with ID starting with 6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901 not found: ID does not exist" containerID="6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.204721 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901"} err="failed to get container status \"6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901\": rpc error: code = NotFound desc = could not find container \"6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901\": container with ID starting with 6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901 not found: ID does not exist"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.204748 4717 scope.go:117] "RemoveContainer" containerID="9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0"
Feb 17 15:13:45 crc kubenswrapper[4717]: E0217 15:13:45.205032 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0\": container with ID starting with 9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0 not found: ID does not exist" containerID="9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.205063 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0"} err="failed to get container status \"9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0\": rpc error: code = NotFound desc = could not find container \"9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0\": container with ID starting with 9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0 not found: ID does not exist"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.205097 4717 scope.go:117] "RemoveContainer" containerID="6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.226817 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 15:13:45 crc kubenswrapper[4717]: E0217 15:13:45.227247 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="882f425e-0f60-444b-af1a-688a91dcbd36" containerName="nova-metadata-log"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.227258 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="882f425e-0f60-444b-af1a-688a91dcbd36" containerName="nova-metadata-log"
Feb 17 15:13:45 crc kubenswrapper[4717]: E0217 15:13:45.227280 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="882f425e-0f60-444b-af1a-688a91dcbd36" containerName="nova-metadata-metadata"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.227286 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="882f425e-0f60-444b-af1a-688a91dcbd36" containerName="nova-metadata-metadata"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.227485 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="882f425e-0f60-444b-af1a-688a91dcbd36" containerName="nova-metadata-metadata"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.227508 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="882f425e-0f60-444b-af1a-688a91dcbd36" containerName="nova-metadata-log"
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.229363 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901"} err="failed to get container status \"6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901\":
rpc error: code = NotFound desc = could not find container \"6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901\": container with ID starting with 6a59e16dc71379426f73d7577d6b458cdc13ee4ba3c0d30e2d34ab8da76bf901 not found: ID does not exist" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.229410 4717 scope.go:117] "RemoveContainer" containerID="9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.229602 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.234573 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0"} err="failed to get container status \"9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0\": rpc error: code = NotFound desc = could not find container \"9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0\": container with ID starting with 9ffd93d67095d3139eafa0b6d2d0b69e96cd7bd0e5adae21422d4c821aca1eb0 not found: ID does not exist" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.234936 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.235230 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.238538 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.330541 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7046c349-8f61-4a96-9e70-31dda2cbe6c1-logs\") pod \"nova-metadata-0\" (UID: 
\"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.330622 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-config-data\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.330698 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.330797 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l75t\" (UniqueName: \"kubernetes.io/projected/7046c349-8f61-4a96-9e70-31dda2cbe6c1-kube-api-access-2l75t\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.330847 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.433053 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" 
Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.433135 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7046c349-8f61-4a96-9e70-31dda2cbe6c1-logs\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.433192 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-config-data\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.433264 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.433369 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l75t\" (UniqueName: \"kubernetes.io/projected/7046c349-8f61-4a96-9e70-31dda2cbe6c1-kube-api-access-2l75t\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.434408 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7046c349-8f61-4a96-9e70-31dda2cbe6c1-logs\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.438682 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.447518 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-config-data\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.455741 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.471831 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l75t\" (UniqueName: \"kubernetes.io/projected/7046c349-8f61-4a96-9e70-31dda2cbe6c1-kube-api-access-2l75t\") pod \"nova-metadata-0\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") " pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.587274 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 15:13:45 crc kubenswrapper[4717]: I0217 15:13:45.858244 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="882f425e-0f60-444b-af1a-688a91dcbd36" path="/var/lib/kubelet/pods/882f425e-0f60-444b-af1a-688a91dcbd36/volumes" Feb 17 15:13:46 crc kubenswrapper[4717]: I0217 15:13:46.070631 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 15:13:46 crc kubenswrapper[4717]: W0217 15:13:46.083253 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7046c349_8f61_4a96_9e70_31dda2cbe6c1.slice/crio-b48646f13f6c7c4f5bed4defdcfe245ff2489071c79772e7d83b513f53cf5b47 WatchSource:0}: Error finding container b48646f13f6c7c4f5bed4defdcfe245ff2489071c79772e7d83b513f53cf5b47: Status 404 returned error can't find the container with id b48646f13f6c7c4f5bed4defdcfe245ff2489071c79772e7d83b513f53cf5b47 Feb 17 15:13:46 crc kubenswrapper[4717]: I0217 15:13:46.140391 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7046c349-8f61-4a96-9e70-31dda2cbe6c1","Type":"ContainerStarted","Data":"b48646f13f6c7c4f5bed4defdcfe245ff2489071c79772e7d83b513f53cf5b47"} Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.159082 4717 generic.go:334] "Generic (PLEG): container finished" podID="d9ed346a-8b4f-464f-8035-73f75ad5e83f" containerID="8448081ee7799d961d0b56d916c4f02569f02a25b1507c1aacfd35e596dee9d6" exitCode=0 Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.159427 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4q8gf" event={"ID":"d9ed346a-8b4f-464f-8035-73f75ad5e83f","Type":"ContainerDied","Data":"8448081ee7799d961d0b56d916c4f02569f02a25b1507c1aacfd35e596dee9d6"} Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.161509 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"7046c349-8f61-4a96-9e70-31dda2cbe6c1","Type":"ContainerStarted","Data":"b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1"} Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.161535 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7046c349-8f61-4a96-9e70-31dda2cbe6c1","Type":"ContainerStarted","Data":"619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee"} Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.208134 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.20811893 podStartE2EDuration="2.20811893s" podCreationTimestamp="2026-02-17 15:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:13:47.202925783 +0000 UTC m=+1293.618766299" watchObservedRunningTime="2026-02-17 15:13:47.20811893 +0000 UTC m=+1293.623959406" Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.458772 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.586295 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.586352 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.591304 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.660292 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-mxm6x"] Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.660535 4717 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" podUID="834aebac-9e50-4e80-868c-231373fa208a" containerName="dnsmasq-dns" containerID="cri-o://38261e17bd91695eea2b84a6bd26372cd543d893ec8226ac6dece833d75d680f" gracePeriod=10 Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.864291 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.864750 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 15:13:47 crc kubenswrapper[4717]: I0217 15:13:47.896966 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.174774 4717 generic.go:334] "Generic (PLEG): container finished" podID="834aebac-9e50-4e80-868c-231373fa208a" containerID="38261e17bd91695eea2b84a6bd26372cd543d893ec8226ac6dece833d75d680f" exitCode=0 Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.174877 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" event={"ID":"834aebac-9e50-4e80-868c-231373fa208a","Type":"ContainerDied","Data":"38261e17bd91695eea2b84a6bd26372cd543d893ec8226ac6dece833d75d680f"} Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.229183 4717 generic.go:334] "Generic (PLEG): container finished" podID="55362b6c-9d2e-4df5-887d-3955d617c166" containerID="ef9639468077aac0cb62b975f12d75e6bc1ad3fa6c8c6e7a33b6a043a4a20a09" exitCode=0 Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.229375 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-68ckm" event={"ID":"55362b6c-9d2e-4df5-887d-3955d617c166","Type":"ContainerDied","Data":"ef9639468077aac0cb62b975f12d75e6bc1ad3fa6c8c6e7a33b6a043a4a20a09"} Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.267925 4717 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.301317 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.402174 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-config\") pod \"834aebac-9e50-4e80-868c-231373fa208a\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.402220 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-swift-storage-0\") pod \"834aebac-9e50-4e80-868c-231373fa208a\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.402266 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-svc\") pod \"834aebac-9e50-4e80-868c-231373fa208a\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.402313 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-sb\") pod \"834aebac-9e50-4e80-868c-231373fa208a\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.402338 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-nb\") pod \"834aebac-9e50-4e80-868c-231373fa208a\" (UID: 
\"834aebac-9e50-4e80-868c-231373fa208a\") " Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.402428 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfhxm\" (UniqueName: \"kubernetes.io/projected/834aebac-9e50-4e80-868c-231373fa208a-kube-api-access-bfhxm\") pod \"834aebac-9e50-4e80-868c-231373fa208a\" (UID: \"834aebac-9e50-4e80-868c-231373fa208a\") " Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.409283 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/834aebac-9e50-4e80-868c-231373fa208a-kube-api-access-bfhxm" (OuterVolumeSpecName: "kube-api-access-bfhxm") pod "834aebac-9e50-4e80-868c-231373fa208a" (UID: "834aebac-9e50-4e80-868c-231373fa208a"). InnerVolumeSpecName "kube-api-access-bfhxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.482627 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "834aebac-9e50-4e80-868c-231373fa208a" (UID: "834aebac-9e50-4e80-868c-231373fa208a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.493255 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-config" (OuterVolumeSpecName: "config") pod "834aebac-9e50-4e80-868c-231373fa208a" (UID: "834aebac-9e50-4e80-868c-231373fa208a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.495514 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "834aebac-9e50-4e80-868c-231373fa208a" (UID: "834aebac-9e50-4e80-868c-231373fa208a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.506025 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.506050 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.506059 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.506067 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfhxm\" (UniqueName: \"kubernetes.io/projected/834aebac-9e50-4e80-868c-231373fa208a-kube-api-access-bfhxm\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.519949 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "834aebac-9e50-4e80-868c-231373fa208a" (UID: "834aebac-9e50-4e80-868c-231373fa208a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.530637 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "834aebac-9e50-4e80-868c-231373fa208a" (UID: "834aebac-9e50-4e80-868c-231373fa208a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.553554 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4q8gf" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.608212 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf4gl\" (UniqueName: \"kubernetes.io/projected/d9ed346a-8b4f-464f-8035-73f75ad5e83f-kube-api-access-vf4gl\") pod \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.608469 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-config-data\") pod \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.608494 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-combined-ca-bundle\") pod \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.608586 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-scripts\") pod \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\" (UID: \"d9ed346a-8b4f-464f-8035-73f75ad5e83f\") " Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.609092 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.609120 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/834aebac-9e50-4e80-868c-231373fa208a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.612834 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-scripts" (OuterVolumeSpecName: "scripts") pod "d9ed346a-8b4f-464f-8035-73f75ad5e83f" (UID: "d9ed346a-8b4f-464f-8035-73f75ad5e83f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.614610 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ed346a-8b4f-464f-8035-73f75ad5e83f-kube-api-access-vf4gl" (OuterVolumeSpecName: "kube-api-access-vf4gl") pod "d9ed346a-8b4f-464f-8035-73f75ad5e83f" (UID: "d9ed346a-8b4f-464f-8035-73f75ad5e83f"). InnerVolumeSpecName "kube-api-access-vf4gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.641256 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9ed346a-8b4f-464f-8035-73f75ad5e83f" (UID: "d9ed346a-8b4f-464f-8035-73f75ad5e83f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.647857 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-config-data" (OuterVolumeSpecName: "config-data") pod "d9ed346a-8b4f-464f-8035-73f75ad5e83f" (UID: "d9ed346a-8b4f-464f-8035-73f75ad5e83f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.668309 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.668346 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.711318 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.711358 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf4gl\" (UniqueName: \"kubernetes.io/projected/d9ed346a-8b4f-464f-8035-73f75ad5e83f-kube-api-access-vf4gl\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.711373 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-config-data\") on node \"crc\" DevicePath \"\"" Feb 
17 15:13:48 crc kubenswrapper[4717]: I0217 15:13:48.711384 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ed346a-8b4f-464f-8035-73f75ad5e83f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.238345 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" event={"ID":"834aebac-9e50-4e80-868c-231373fa208a","Type":"ContainerDied","Data":"4e1a95c59e5e2e34b2e19b3ee2ed9ed07a64526da44b3ae78fed23154096465a"} Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.238392 4717 scope.go:117] "RemoveContainer" containerID="38261e17bd91695eea2b84a6bd26372cd543d893ec8226ac6dece833d75d680f" Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.238521 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-mxm6x" Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.242390 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4q8gf"
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.248705 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4q8gf" event={"ID":"d9ed346a-8b4f-464f-8035-73f75ad5e83f","Type":"ContainerDied","Data":"1c6106f1d6b3106fd2125e890c18b8e34830e1aa3252cc1e90601cd42a4a12ec"}
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.248742 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c6106f1d6b3106fd2125e890c18b8e34830e1aa3252cc1e90601cd42a4a12ec"
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.313141 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-mxm6x"]
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.315949 4717 scope.go:117] "RemoveContainer" containerID="c0ea11c31a144ba5f4b8c9c3ef241b602ae527ce24eb44b430fd30cc4ca68c6f"
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.348526 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-mxm6x"]
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.512959 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.513260 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerName="nova-api-log" containerID="cri-o://4a8943bb8a1f1503ea2d27b3cda586d2987796069eaec1f34a9c5b5b471b87c8" gracePeriod=30
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.513764 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerName="nova-api-api" containerID="cri-o://671bfb3d4d4694d49a4c5cb5dd81ef5546c7bb3d76120ae971358f1d9eb14908" gracePeriod=30
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.531786 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.654693 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.655025 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7046c349-8f61-4a96-9e70-31dda2cbe6c1" containerName="nova-metadata-log" containerID="cri-o://619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee" gracePeriod=30
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.655410 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7046c349-8f61-4a96-9e70-31dda2cbe6c1" containerName="nova-metadata-metadata" containerID="cri-o://b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1" gracePeriod=30
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.864341 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="834aebac-9e50-4e80-868c-231373fa208a" path="/var/lib/kubelet/pods/834aebac-9e50-4e80-868c-231373fa208a/volumes"
Feb 17 15:13:49 crc kubenswrapper[4717]: I0217 15:13:49.942218 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.038178 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-config-data\") pod \"55362b6c-9d2e-4df5-887d-3955d617c166\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") "
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.038259 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-combined-ca-bundle\") pod \"55362b6c-9d2e-4df5-887d-3955d617c166\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") "
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.038553 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdht9\" (UniqueName: \"kubernetes.io/projected/55362b6c-9d2e-4df5-887d-3955d617c166-kube-api-access-vdht9\") pod \"55362b6c-9d2e-4df5-887d-3955d617c166\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") "
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.038662 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-scripts\") pod \"55362b6c-9d2e-4df5-887d-3955d617c166\" (UID: \"55362b6c-9d2e-4df5-887d-3955d617c166\") "
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.044218 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55362b6c-9d2e-4df5-887d-3955d617c166-kube-api-access-vdht9" (OuterVolumeSpecName: "kube-api-access-vdht9") pod "55362b6c-9d2e-4df5-887d-3955d617c166" (UID: "55362b6c-9d2e-4df5-887d-3955d617c166"). InnerVolumeSpecName "kube-api-access-vdht9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.045042 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-scripts" (OuterVolumeSpecName: "scripts") pod "55362b6c-9d2e-4df5-887d-3955d617c166" (UID: "55362b6c-9d2e-4df5-887d-3955d617c166"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.090222 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55362b6c-9d2e-4df5-887d-3955d617c166" (UID: "55362b6c-9d2e-4df5-887d-3955d617c166"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.090284 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-config-data" (OuterVolumeSpecName: "config-data") pod "55362b6c-9d2e-4df5-887d-3955d617c166" (UID: "55362b6c-9d2e-4df5-887d-3955d617c166"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.143698 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdht9\" (UniqueName: \"kubernetes.io/projected/55362b6c-9d2e-4df5-887d-3955d617c166-kube-api-access-vdht9\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.143749 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.143762 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.143776 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55362b6c-9d2e-4df5-887d-3955d617c166-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.286441 4717 generic.go:334] "Generic (PLEG): container finished" podID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerID="4a8943bb8a1f1503ea2d27b3cda586d2987796069eaec1f34a9c5b5b471b87c8" exitCode=143
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.287923 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41","Type":"ContainerDied","Data":"4a8943bb8a1f1503ea2d27b3cda586d2987796069eaec1f34a9c5b5b471b87c8"}
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.290070 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.291709 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-68ckm" event={"ID":"55362b6c-9d2e-4df5-887d-3955d617c166","Type":"ContainerDied","Data":"23d775f20ea0aa0950f0825b2ec3adedf50dadf63c5c6914374d872619abfc05"}
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.291750 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23d775f20ea0aa0950f0825b2ec3adedf50dadf63c5c6914374d872619abfc05"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.291799 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-68ckm"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.308595 4717 generic.go:334] "Generic (PLEG): container finished" podID="7046c349-8f61-4a96-9e70-31dda2cbe6c1" containerID="b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1" exitCode=0
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.308627 4717 generic.go:334] "Generic (PLEG): container finished" podID="7046c349-8f61-4a96-9e70-31dda2cbe6c1" containerID="619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee" exitCode=143
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.308696 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7046c349-8f61-4a96-9e70-31dda2cbe6c1","Type":"ContainerDied","Data":"b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1"}
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.308727 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7046c349-8f61-4a96-9e70-31dda2cbe6c1","Type":"ContainerDied","Data":"619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee"}
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.308740 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7046c349-8f61-4a96-9e70-31dda2cbe6c1","Type":"ContainerDied","Data":"b48646f13f6c7c4f5bed4defdcfe245ff2489071c79772e7d83b513f53cf5b47"}
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.308755 4717 scope.go:117] "RemoveContainer" containerID="b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.308865 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.314958 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6cb29e62-e185-4ccf-beb0-763a176ae07f" containerName="nova-scheduler-scheduler" containerID="cri-o://5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e" gracePeriod=30
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.339181 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 17 15:13:50 crc kubenswrapper[4717]: E0217 15:13:50.339733 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7046c349-8f61-4a96-9e70-31dda2cbe6c1" containerName="nova-metadata-metadata"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.339751 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7046c349-8f61-4a96-9e70-31dda2cbe6c1" containerName="nova-metadata-metadata"
Feb 17 15:13:50 crc kubenswrapper[4717]: E0217 15:13:50.339768 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ed346a-8b4f-464f-8035-73f75ad5e83f" containerName="nova-manage"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.339774 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ed346a-8b4f-464f-8035-73f75ad5e83f" containerName="nova-manage"
Feb 17 15:13:50 crc kubenswrapper[4717]: E0217 15:13:50.339798 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7046c349-8f61-4a96-9e70-31dda2cbe6c1" containerName="nova-metadata-log"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.339804 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7046c349-8f61-4a96-9e70-31dda2cbe6c1" containerName="nova-metadata-log"
Feb 17 15:13:50 crc kubenswrapper[4717]: E0217 15:13:50.339813 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834aebac-9e50-4e80-868c-231373fa208a" containerName="init"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.339819 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="834aebac-9e50-4e80-868c-231373fa208a" containerName="init"
Feb 17 15:13:50 crc kubenswrapper[4717]: E0217 15:13:50.339828 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834aebac-9e50-4e80-868c-231373fa208a" containerName="dnsmasq-dns"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.339834 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="834aebac-9e50-4e80-868c-231373fa208a" containerName="dnsmasq-dns"
Feb 17 15:13:50 crc kubenswrapper[4717]: E0217 15:13:50.339845 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55362b6c-9d2e-4df5-887d-3955d617c166" containerName="nova-cell1-conductor-db-sync"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.339852 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="55362b6c-9d2e-4df5-887d-3955d617c166" containerName="nova-cell1-conductor-db-sync"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.352577 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="834aebac-9e50-4e80-868c-231373fa208a" containerName="dnsmasq-dns"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.352608 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-nova-metadata-tls-certs\") pod \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") "
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.352676 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7046c349-8f61-4a96-9e70-31dda2cbe6c1-logs\") pod \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") "
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.352680 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7046c349-8f61-4a96-9e70-31dda2cbe6c1" containerName="nova-metadata-log"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.352707 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7046c349-8f61-4a96-9e70-31dda2cbe6c1" containerName="nova-metadata-metadata"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.352734 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ed346a-8b4f-464f-8035-73f75ad5e83f" containerName="nova-manage"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.352737 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-combined-ca-bundle\") pod \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") "
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.352779 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-config-data\") pod \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") "
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.352950 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l75t\" (UniqueName: \"kubernetes.io/projected/7046c349-8f61-4a96-9e70-31dda2cbe6c1-kube-api-access-2l75t\") pod \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\" (UID: \"7046c349-8f61-4a96-9e70-31dda2cbe6c1\") "
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.352742 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="55362b6c-9d2e-4df5-887d-3955d617c166" containerName="nova-cell1-conductor-db-sync"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.355128 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7046c349-8f61-4a96-9e70-31dda2cbe6c1-logs" (OuterVolumeSpecName: "logs") pod "7046c349-8f61-4a96-9e70-31dda2cbe6c1" (UID: "7046c349-8f61-4a96-9e70-31dda2cbe6c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.356017 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.361605 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7046c349-8f61-4a96-9e70-31dda2cbe6c1-kube-api-access-2l75t" (OuterVolumeSpecName: "kube-api-access-2l75t") pod "7046c349-8f61-4a96-9e70-31dda2cbe6c1" (UID: "7046c349-8f61-4a96-9e70-31dda2cbe6c1"). InnerVolumeSpecName "kube-api-access-2l75t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.364452 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.365613 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.400594 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7046c349-8f61-4a96-9e70-31dda2cbe6c1" (UID: "7046c349-8f61-4a96-9e70-31dda2cbe6c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.405641 4717 scope.go:117] "RemoveContainer" containerID="619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.406387 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-config-data" (OuterVolumeSpecName: "config-data") pod "7046c349-8f61-4a96-9e70-31dda2cbe6c1" (UID: "7046c349-8f61-4a96-9e70-31dda2cbe6c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.428934 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7046c349-8f61-4a96-9e70-31dda2cbe6c1" (UID: "7046c349-8f61-4a96-9e70-31dda2cbe6c1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.458242 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/731c9760-bb30-4dd0-b246-0fb9ed312ae9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"731c9760-bb30-4dd0-b246-0fb9ed312ae9\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.458391 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731c9760-bb30-4dd0-b246-0fb9ed312ae9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"731c9760-bb30-4dd0-b246-0fb9ed312ae9\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.458468 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twzxl\" (UniqueName: \"kubernetes.io/projected/731c9760-bb30-4dd0-b246-0fb9ed312ae9-kube-api-access-twzxl\") pod \"nova-cell1-conductor-0\" (UID: \"731c9760-bb30-4dd0-b246-0fb9ed312ae9\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.458548 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.458560 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7046c349-8f61-4a96-9e70-31dda2cbe6c1-logs\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.458569 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.458577 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7046c349-8f61-4a96-9e70-31dda2cbe6c1-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.458587 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l75t\" (UniqueName: \"kubernetes.io/projected/7046c349-8f61-4a96-9e70-31dda2cbe6c1-kube-api-access-2l75t\") on node \"crc\" DevicePath \"\""
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.461752 4717 scope.go:117] "RemoveContainer" containerID="b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1"
Feb 17 15:13:50 crc kubenswrapper[4717]: E0217 15:13:50.462651 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1\": container with ID starting with b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1 not found: ID does not exist" containerID="b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.462690 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1"} err="failed to get container status \"b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1\": rpc error: code = NotFound desc = could not find container \"b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1\": container with ID starting with b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1 not found: ID does not exist"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.462717 4717 scope.go:117] "RemoveContainer" containerID="619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee"
Feb 17 15:13:50 crc kubenswrapper[4717]: E0217 15:13:50.467543 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee\": container with ID starting with 619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee not found: ID does not exist" containerID="619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.467586 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee"} err="failed to get container status \"619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee\": rpc error: code = NotFound desc = could not find container \"619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee\": container with ID starting with 619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee not found: ID does not exist"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.467610 4717 scope.go:117] "RemoveContainer" containerID="b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.468518 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1"} err="failed to get container status \"b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1\": rpc error: code = NotFound desc = could not find container \"b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1\": container with ID starting with b9ee27e20a0bb7d9254d9a54e7ba74729ac060573b72517f70d396cf2a4112b1 not found: ID does not exist"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.468597 4717 scope.go:117] "RemoveContainer" containerID="619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.469928 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee"} err="failed to get container status \"619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee\": rpc error: code = NotFound desc = could not find container \"619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee\": container with ID starting with 619bb663b166941bf7aba4d251402c82f44cf14ad00f888af50fb90a94c2d4ee not found: ID does not exist"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.567375 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731c9760-bb30-4dd0-b246-0fb9ed312ae9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"731c9760-bb30-4dd0-b246-0fb9ed312ae9\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.567480 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twzxl\" (UniqueName: \"kubernetes.io/projected/731c9760-bb30-4dd0-b246-0fb9ed312ae9-kube-api-access-twzxl\") pod \"nova-cell1-conductor-0\" (UID: \"731c9760-bb30-4dd0-b246-0fb9ed312ae9\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.567526 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/731c9760-bb30-4dd0-b246-0fb9ed312ae9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"731c9760-bb30-4dd0-b246-0fb9ed312ae9\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.574937 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/731c9760-bb30-4dd0-b246-0fb9ed312ae9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"731c9760-bb30-4dd0-b246-0fb9ed312ae9\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.575140 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731c9760-bb30-4dd0-b246-0fb9ed312ae9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"731c9760-bb30-4dd0-b246-0fb9ed312ae9\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.584407 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twzxl\" (UniqueName: \"kubernetes.io/projected/731c9760-bb30-4dd0-b246-0fb9ed312ae9-kube-api-access-twzxl\") pod \"nova-cell1-conductor-0\" (UID: \"731c9760-bb30-4dd0-b246-0fb9ed312ae9\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.653316 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.674511 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.688465 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.690649 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.694676 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.694986 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.703938 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.715452 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.770633 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea023995-3d37-4a3b-b22b-9903c7e21fc6-logs\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.770702 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zdqz\" (UniqueName: \"kubernetes.io/projected/ea023995-3d37-4a3b-b22b-9903c7e21fc6-kube-api-access-7zdqz\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.770795 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.770843 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-config-data\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.770880 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.872810 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea023995-3d37-4a3b-b22b-9903c7e21fc6-logs\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.873265 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zdqz\" (UniqueName: \"kubernetes.io/projected/ea023995-3d37-4a3b-b22b-9903c7e21fc6-kube-api-access-7zdqz\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.873432 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.873472 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-config-data\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.873500 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.875892 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea023995-3d37-4a3b-b22b-9903c7e21fc6-logs\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.880304 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.882906 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-config-data\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.893042 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:50 crc kubenswrapper[4717]: I0217 15:13:50.897514 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zdqz\" (UniqueName: \"kubernetes.io/projected/ea023995-3d37-4a3b-b22b-9903c7e21fc6-kube-api-access-7zdqz\") pod \"nova-metadata-0\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " pod="openstack/nova-metadata-0"
Feb 17 15:13:51 crc kubenswrapper[4717]: I0217 15:13:51.020477 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 15:13:51 crc kubenswrapper[4717]: I0217 15:13:51.237808 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 17 15:13:51 crc kubenswrapper[4717]: I0217 15:13:51.332565 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"731c9760-bb30-4dd0-b246-0fb9ed312ae9","Type":"ContainerStarted","Data":"59a3de40cb50cd423e9db96ae71613dc9f1c2702f62efb37926141b8d7e0e71b"}
Feb 17 15:13:51 crc kubenswrapper[4717]: W0217 15:13:51.503154 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea023995_3d37_4a3b_b22b_9903c7e21fc6.slice/crio-abaf49433953181dccbe67ad76a14c6506457276c934616840c6133a5ed8dca3 WatchSource:0}: Error finding container abaf49433953181dccbe67ad76a14c6506457276c934616840c6133a5ed8dca3: Status 404 returned error can't find the container with id abaf49433953181dccbe67ad76a14c6506457276c934616840c6133a5ed8dca3
Feb 17 15:13:51 crc kubenswrapper[4717]: I0217 15:13:51.512988 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 15:13:51 crc kubenswrapper[4717]: I0217 15:13:51.871242 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7046c349-8f61-4a96-9e70-31dda2cbe6c1" path="/var/lib/kubelet/pods/7046c349-8f61-4a96-9e70-31dda2cbe6c1/volumes"
Feb 17 15:13:52 crc kubenswrapper[4717]: I0217 15:13:52.345201 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"731c9760-bb30-4dd0-b246-0fb9ed312ae9","Type":"ContainerStarted","Data":"f836c53af473bf397be2ad1d3974b29c9feb63967342ff314068e2ac3004e5f0"}
Feb 17 15:13:52 crc kubenswrapper[4717]: I0217 15:13:52.345250 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 17 15:13:52 crc kubenswrapper[4717]: I0217 15:13:52.347250 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea023995-3d37-4a3b-b22b-9903c7e21fc6","Type":"ContainerStarted","Data":"6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06"}
Feb 17 15:13:52 crc kubenswrapper[4717]: I0217 15:13:52.347282 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea023995-3d37-4a3b-b22b-9903c7e21fc6","Type":"ContainerStarted","Data":"aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384"}
Feb 17 15:13:52 crc kubenswrapper[4717]: I0217 15:13:52.347295 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea023995-3d37-4a3b-b22b-9903c7e21fc6","Type":"ContainerStarted","Data":"abaf49433953181dccbe67ad76a14c6506457276c934616840c6133a5ed8dca3"}
Feb 17 15:13:52 crc kubenswrapper[4717]: I0217 15:13:52.373777 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.373754391 podStartE2EDuration="2.373754391s" podCreationTimestamp="2026-02-17 15:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:13:52.363624424 +0000 UTC m=+1298.779464900" watchObservedRunningTime="2026-02-17 15:13:52.373754391 +0000 UTC m=+1298.789594877"
Feb 17 15:13:52 crc kubenswrapper[4717]: I0217 15:13:52.411218 4717 pod_startup_latency_tracker.go:104] "Observed pod
startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.411199494 podStartE2EDuration="2.411199494s" podCreationTimestamp="2026-02-17 15:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:13:52.408354683 +0000 UTC m=+1298.824195169" watchObservedRunningTime="2026-02-17 15:13:52.411199494 +0000 UTC m=+1298.827039970" Feb 17 15:13:52 crc kubenswrapper[4717]: E0217 15:13:52.867424 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 15:13:52 crc kubenswrapper[4717]: E0217 15:13:52.869969 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 15:13:52 crc kubenswrapper[4717]: E0217 15:13:52.871872 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 15:13:52 crc kubenswrapper[4717]: E0217 15:13:52.871937 4717 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6cb29e62-e185-4ccf-beb0-763a176ae07f" containerName="nova-scheduler-scheduler" Feb 
17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.185236 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.285977 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-config-data\") pod \"6cb29e62-e185-4ccf-beb0-763a176ae07f\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.286334 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk5x2\" (UniqueName: \"kubernetes.io/projected/6cb29e62-e185-4ccf-beb0-763a176ae07f-kube-api-access-sk5x2\") pod \"6cb29e62-e185-4ccf-beb0-763a176ae07f\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.286365 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-combined-ca-bundle\") pod \"6cb29e62-e185-4ccf-beb0-763a176ae07f\" (UID: \"6cb29e62-e185-4ccf-beb0-763a176ae07f\") " Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.303458 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cb29e62-e185-4ccf-beb0-763a176ae07f-kube-api-access-sk5x2" (OuterVolumeSpecName: "kube-api-access-sk5x2") pod "6cb29e62-e185-4ccf-beb0-763a176ae07f" (UID: "6cb29e62-e185-4ccf-beb0-763a176ae07f"). InnerVolumeSpecName "kube-api-access-sk5x2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.317422 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-config-data" (OuterVolumeSpecName: "config-data") pod "6cb29e62-e185-4ccf-beb0-763a176ae07f" (UID: "6cb29e62-e185-4ccf-beb0-763a176ae07f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.357691 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cb29e62-e185-4ccf-beb0-763a176ae07f" (UID: "6cb29e62-e185-4ccf-beb0-763a176ae07f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.388691 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.388726 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk5x2\" (UniqueName: \"kubernetes.io/projected/6cb29e62-e185-4ccf-beb0-763a176ae07f-kube-api-access-sk5x2\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.388739 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb29e62-e185-4ccf-beb0-763a176ae07f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.409137 4717 generic.go:334] "Generic (PLEG): container finished" podID="6cb29e62-e185-4ccf-beb0-763a176ae07f" containerID="5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e" 
exitCode=0 Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.409184 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6cb29e62-e185-4ccf-beb0-763a176ae07f","Type":"ContainerDied","Data":"5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e"} Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.409231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6cb29e62-e185-4ccf-beb0-763a176ae07f","Type":"ContainerDied","Data":"35babb2472901e195c66cb5c8c4444747dc2fc7f8fdad4a3a4f1c5211459c427"} Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.409250 4717 scope.go:117] "RemoveContainer" containerID="5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.409247 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.413988 4717 generic.go:334] "Generic (PLEG): container finished" podID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerID="671bfb3d4d4694d49a4c5cb5dd81ef5546c7bb3d76120ae971358f1d9eb14908" exitCode=0 Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.414042 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41","Type":"ContainerDied","Data":"671bfb3d4d4694d49a4c5cb5dd81ef5546c7bb3d76120ae971358f1d9eb14908"} Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.435725 4717 scope.go:117] "RemoveContainer" containerID="5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e" Feb 17 15:13:55 crc kubenswrapper[4717]: E0217 15:13:55.436136 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e\": container with ID starting with 
5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e not found: ID does not exist" containerID="5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.436164 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e"} err="failed to get container status \"5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e\": rpc error: code = NotFound desc = could not find container \"5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e\": container with ID starting with 5ad2c675e5fa8f62b049570cff9e58a161ef97fa35d49e793fb3d876c561986e not found: ID does not exist" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.436227 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.485495 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.489594 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-logs\") pod \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.489979 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-combined-ca-bundle\") pod \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.490132 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cldb\" (UniqueName: 
\"kubernetes.io/projected/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-kube-api-access-5cldb\") pod \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.490257 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-config-data\") pod \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\" (UID: \"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41\") " Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.492575 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-logs" (OuterVolumeSpecName: "logs") pod "336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" (UID: "336126bd-b5e1-4870-8a9f-9d2b4a5c7e41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.494814 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-kube-api-access-5cldb" (OuterVolumeSpecName: "kube-api-access-5cldb") pod "336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" (UID: "336126bd-b5e1-4870-8a9f-9d2b4a5c7e41"). InnerVolumeSpecName "kube-api-access-5cldb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.500901 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.513106 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:13:55 crc kubenswrapper[4717]: E0217 15:13:55.513519 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb29e62-e185-4ccf-beb0-763a176ae07f" containerName="nova-scheduler-scheduler" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.513538 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb29e62-e185-4ccf-beb0-763a176ae07f" containerName="nova-scheduler-scheduler" Feb 17 15:13:55 crc kubenswrapper[4717]: E0217 15:13:55.513574 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerName="nova-api-api" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.513585 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerName="nova-api-api" Feb 17 15:13:55 crc kubenswrapper[4717]: E0217 15:13:55.513777 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerName="nova-api-log" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.516682 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerName="nova-api-log" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.517661 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cb29e62-e185-4ccf-beb0-763a176ae07f" containerName="nova-scheduler-scheduler" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.517711 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerName="nova-api-api" Feb 17 15:13:55 crc 
kubenswrapper[4717]: I0217 15:13:55.517737 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" containerName="nova-api-log" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.518536 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.521426 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" (UID: "336126bd-b5e1-4870-8a9f-9d2b4a5c7e41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.530130 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.533956 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-config-data" (OuterVolumeSpecName: "config-data") pod "336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" (UID: "336126bd-b5e1-4870-8a9f-9d2b4a5c7e41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.538496 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.593374 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxvfl\" (UniqueName: \"kubernetes.io/projected/d34708cb-eeda-449c-83ff-ea509cc7dbd1-kube-api-access-fxvfl\") pod \"nova-scheduler-0\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.593441 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.593924 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-config-data\") pod \"nova-scheduler-0\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.594255 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.594327 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.594348 4717 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-5cldb\" (UniqueName: \"kubernetes.io/projected/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-kube-api-access-5cldb\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.594360 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.696486 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.697798 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-config-data\") pod \"nova-scheduler-0\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.697920 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxvfl\" (UniqueName: \"kubernetes.io/projected/d34708cb-eeda-449c-83ff-ea509cc7dbd1-kube-api-access-fxvfl\") pod \"nova-scheduler-0\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.702190 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-config-data\") pod \"nova-scheduler-0\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.703059 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.717883 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxvfl\" (UniqueName: \"kubernetes.io/projected/d34708cb-eeda-449c-83ff-ea509cc7dbd1-kube-api-access-fxvfl\") pod \"nova-scheduler-0\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.843963 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 15:13:55 crc kubenswrapper[4717]: I0217 15:13:55.861293 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cb29e62-e185-4ccf-beb0-763a176ae07f" path="/var/lib/kubelet/pods/6cb29e62-e185-4ccf-beb0-763a176ae07f/volumes" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.022530 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.022795 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.332800 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:13:56 crc kubenswrapper[4717]: W0217 15:13:56.348024 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd34708cb_eeda_449c_83ff_ea509cc7dbd1.slice/crio-d364e92cd1f65dd39a1088b54af8e18a6dcd8fdbd2fc59f1c2a0e4516733620d WatchSource:0}: Error finding container d364e92cd1f65dd39a1088b54af8e18a6dcd8fdbd2fc59f1c2a0e4516733620d: Status 404 returned error can't find the container with id 
d364e92cd1f65dd39a1088b54af8e18a6dcd8fdbd2fc59f1c2a0e4516733620d Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.424320 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"336126bd-b5e1-4870-8a9f-9d2b4a5c7e41","Type":"ContainerDied","Data":"e340b789a9cbe0e9b6587c69dffeb1a6b1d8f0d4e11dd218736e35a11f56f5eb"} Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.424358 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.424663 4717 scope.go:117] "RemoveContainer" containerID="671bfb3d4d4694d49a4c5cb5dd81ef5546c7bb3d76120ae971358f1d9eb14908" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.426936 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d34708cb-eeda-449c-83ff-ea509cc7dbd1","Type":"ContainerStarted","Data":"d364e92cd1f65dd39a1088b54af8e18a6dcd8fdbd2fc59f1c2a0e4516733620d"} Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.473495 4717 scope.go:117] "RemoveContainer" containerID="4a8943bb8a1f1503ea2d27b3cda586d2987796069eaec1f34a9c5b5b471b87c8" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.501334 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.528202 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.542233 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.544207 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.547848 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.550759 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.616709 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1070b40-313c-468b-af8f-6d322571152d-logs\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.616846 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4vgk\" (UniqueName: \"kubernetes.io/projected/a1070b40-313c-468b-af8f-6d322571152d-kube-api-access-n4vgk\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.616921 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.616995 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-config-data\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.718711 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a1070b40-313c-468b-af8f-6d322571152d-logs\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.718773 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4vgk\" (UniqueName: \"kubernetes.io/projected/a1070b40-313c-468b-af8f-6d322571152d-kube-api-access-n4vgk\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.718810 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.718839 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-config-data\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.719525 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1070b40-313c-468b-af8f-6d322571152d-logs\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.723463 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.723849 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-config-data\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.735226 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4vgk\" (UniqueName: \"kubernetes.io/projected/a1070b40-313c-468b-af8f-6d322571152d-kube-api-access-n4vgk\") pod \"nova-api-0\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " pod="openstack/nova-api-0" Feb 17 15:13:56 crc kubenswrapper[4717]: I0217 15:13:56.878255 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:13:57 crc kubenswrapper[4717]: I0217 15:13:57.348972 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:13:57 crc kubenswrapper[4717]: I0217 15:13:57.446013 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d34708cb-eeda-449c-83ff-ea509cc7dbd1","Type":"ContainerStarted","Data":"c2f167737850bb7a83afbea1f3313c44efcb7cef6d13a8e037b33ab648a7659b"} Feb 17 15:13:57 crc kubenswrapper[4717]: I0217 15:13:57.447694 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1070b40-313c-468b-af8f-6d322571152d","Type":"ContainerStarted","Data":"b5068eb7e9120bd35a32ceb1c4e556e5935736b4708629f5415bfbc5a115f705"} Feb 17 15:13:57 crc kubenswrapper[4717]: I0217 15:13:57.860954 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="336126bd-b5e1-4870-8a9f-9d2b4a5c7e41" path="/var/lib/kubelet/pods/336126bd-b5e1-4870-8a9f-9d2b4a5c7e41/volumes" Feb 17 15:13:58 crc kubenswrapper[4717]: I0217 15:13:58.355443 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 15:13:58 crc kubenswrapper[4717]: 
I0217 15:13:58.383554 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.383533975 podStartE2EDuration="3.383533975s" podCreationTimestamp="2026-02-17 15:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:13:57.464487856 +0000 UTC m=+1303.880328342" watchObservedRunningTime="2026-02-17 15:13:58.383533975 +0000 UTC m=+1304.799374451" Feb 17 15:13:58 crc kubenswrapper[4717]: I0217 15:13:58.480851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1070b40-313c-468b-af8f-6d322571152d","Type":"ContainerStarted","Data":"e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d"} Feb 17 15:13:58 crc kubenswrapper[4717]: I0217 15:13:58.480908 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1070b40-313c-468b-af8f-6d322571152d","Type":"ContainerStarted","Data":"4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3"} Feb 17 15:13:58 crc kubenswrapper[4717]: I0217 15:13:58.505585 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.505567337 podStartE2EDuration="2.505567337s" podCreationTimestamp="2026-02-17 15:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:13:58.498827166 +0000 UTC m=+1304.914667652" watchObservedRunningTime="2026-02-17 15:13:58.505567337 +0000 UTC m=+1304.921407813" Feb 17 15:14:00 crc kubenswrapper[4717]: I0217 15:14:00.732130 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 17 15:14:00 crc kubenswrapper[4717]: I0217 15:14:00.845054 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Feb 17 15:14:01 crc kubenswrapper[4717]: I0217 15:14:01.022210 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 15:14:01 crc kubenswrapper[4717]: I0217 15:14:01.022267 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 15:14:01 crc kubenswrapper[4717]: I0217 15:14:01.962316 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 15:14:01 crc kubenswrapper[4717]: I0217 15:14:01.962514 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e905bd78-7554-4636-b508-a2a67078018e" containerName="kube-state-metrics" containerID="cri-o://91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e" gracePeriod=30 Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.036296 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.036324 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.475698 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.521728 4717 generic.go:334] "Generic (PLEG): container finished" podID="e905bd78-7554-4636-b508-a2a67078018e" containerID="91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e" exitCode=2 Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.521791 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e905bd78-7554-4636-b508-a2a67078018e","Type":"ContainerDied","Data":"91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e"} Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.521817 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e905bd78-7554-4636-b508-a2a67078018e","Type":"ContainerDied","Data":"c192396115878c5165d2a70890fecfdb19fe679f7a6da71a920de31f001223e7"} Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.521833 4717 scope.go:117] "RemoveContainer" containerID="91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.522148 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.540916 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8j2j\" (UniqueName: \"kubernetes.io/projected/e905bd78-7554-4636-b508-a2a67078018e-kube-api-access-r8j2j\") pod \"e905bd78-7554-4636-b508-a2a67078018e\" (UID: \"e905bd78-7554-4636-b508-a2a67078018e\") " Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.553362 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e905bd78-7554-4636-b508-a2a67078018e-kube-api-access-r8j2j" (OuterVolumeSpecName: "kube-api-access-r8j2j") pod "e905bd78-7554-4636-b508-a2a67078018e" (UID: "e905bd78-7554-4636-b508-a2a67078018e"). InnerVolumeSpecName "kube-api-access-r8j2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.569283 4717 scope.go:117] "RemoveContainer" containerID="91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e" Feb 17 15:14:02 crc kubenswrapper[4717]: E0217 15:14:02.573230 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e\": container with ID starting with 91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e not found: ID does not exist" containerID="91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.573280 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e"} err="failed to get container status \"91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e\": rpc error: code = NotFound desc = could not find container 
\"91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e\": container with ID starting with 91d6f112d7b4ef4bd29455f40fd0c45c5838aa79449677a75b54ff3982b32b8e not found: ID does not exist" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.643429 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8j2j\" (UniqueName: \"kubernetes.io/projected/e905bd78-7554-4636-b508-a2a67078018e-kube-api-access-r8j2j\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.856660 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.864463 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.874521 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 15:14:02 crc kubenswrapper[4717]: E0217 15:14:02.874911 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e905bd78-7554-4636-b508-a2a67078018e" containerName="kube-state-metrics" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.874928 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e905bd78-7554-4636-b508-a2a67078018e" containerName="kube-state-metrics" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.875124 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e905bd78-7554-4636-b508-a2a67078018e" containerName="kube-state-metrics" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.875705 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.877515 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.880589 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.888707 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.949144 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10b56503-0d1b-474b-8b7c-0d07f5eae27b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.949466 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hszr9\" (UniqueName: \"kubernetes.io/projected/10b56503-0d1b-474b-8b7c-0d07f5eae27b-kube-api-access-hszr9\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.949566 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b56503-0d1b-474b-8b7c-0d07f5eae27b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:02 crc kubenswrapper[4717]: I0217 15:14:02.949725 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/10b56503-0d1b-474b-8b7c-0d07f5eae27b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:03 crc kubenswrapper[4717]: I0217 15:14:03.052264 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10b56503-0d1b-474b-8b7c-0d07f5eae27b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:03 crc kubenswrapper[4717]: I0217 15:14:03.053145 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10b56503-0d1b-474b-8b7c-0d07f5eae27b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:03 crc kubenswrapper[4717]: I0217 15:14:03.053331 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hszr9\" (UniqueName: \"kubernetes.io/projected/10b56503-0d1b-474b-8b7c-0d07f5eae27b-kube-api-access-hszr9\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:03 crc kubenswrapper[4717]: I0217 15:14:03.053389 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b56503-0d1b-474b-8b7c-0d07f5eae27b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:03 crc kubenswrapper[4717]: I0217 15:14:03.065795 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/10b56503-0d1b-474b-8b7c-0d07f5eae27b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:03 crc kubenswrapper[4717]: I0217 15:14:03.068772 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10b56503-0d1b-474b-8b7c-0d07f5eae27b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:03 crc kubenswrapper[4717]: I0217 15:14:03.069054 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b56503-0d1b-474b-8b7c-0d07f5eae27b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:03 crc kubenswrapper[4717]: I0217 15:14:03.073772 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hszr9\" (UniqueName: \"kubernetes.io/projected/10b56503-0d1b-474b-8b7c-0d07f5eae27b-kube-api-access-hszr9\") pod \"kube-state-metrics-0\" (UID: \"10b56503-0d1b-474b-8b7c-0d07f5eae27b\") " pod="openstack/kube-state-metrics-0" Feb 17 15:14:03 crc kubenswrapper[4717]: I0217 15:14:03.202547 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 15:14:03 crc kubenswrapper[4717]: I0217 15:14:03.684329 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 15:14:03 crc kubenswrapper[4717]: W0217 15:14:03.695919 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10b56503_0d1b_474b_8b7c_0d07f5eae27b.slice/crio-16f4bce7fc048cceb02d0ba4d0995009947350440d51257c66064894bc9e1f6a WatchSource:0}: Error finding container 16f4bce7fc048cceb02d0ba4d0995009947350440d51257c66064894bc9e1f6a: Status 404 returned error can't find the container with id 16f4bce7fc048cceb02d0ba4d0995009947350440d51257c66064894bc9e1f6a Feb 17 15:14:03 crc kubenswrapper[4717]: I0217 15:14:03.859843 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e905bd78-7554-4636-b508-a2a67078018e" path="/var/lib/kubelet/pods/e905bd78-7554-4636-b508-a2a67078018e/volumes" Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.203644 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.204491 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="ceilometer-central-agent" containerID="cri-o://1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be" gracePeriod=30 Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.204760 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="proxy-httpd" containerID="cri-o://13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb" gracePeriod=30 Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.204787 4717 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="ceilometer-notification-agent" containerID="cri-o://4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6" gracePeriod=30 Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.204836 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="sg-core" containerID="cri-o://43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0" gracePeriod=30 Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.546283 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"10b56503-0d1b-474b-8b7c-0d07f5eae27b","Type":"ContainerStarted","Data":"dc0a54a8b53e47bca9ef19adc7c386bd09a90e81161485032a6b6736998ce96f"} Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.546326 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"10b56503-0d1b-474b-8b7c-0d07f5eae27b","Type":"ContainerStarted","Data":"16f4bce7fc048cceb02d0ba4d0995009947350440d51257c66064894bc9e1f6a"} Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.546365 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.548682 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerID="13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb" exitCode=0 Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.548804 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerID="43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0" exitCode=2 Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.548756 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"7cb760d7-52d5-42d6-9139-f87799aa88b9","Type":"ContainerDied","Data":"13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb"} Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.548917 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cb760d7-52d5-42d6-9139-f87799aa88b9","Type":"ContainerDied","Data":"43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0"} Feb 17 15:14:04 crc kubenswrapper[4717]: I0217 15:14:04.571718 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.148327367 podStartE2EDuration="2.571700031s" podCreationTimestamp="2026-02-17 15:14:02 +0000 UTC" firstStartedPulling="2026-02-17 15:14:03.70017233 +0000 UTC m=+1310.116012826" lastFinishedPulling="2026-02-17 15:14:04.123545014 +0000 UTC m=+1310.539385490" observedRunningTime="2026-02-17 15:14:04.569451357 +0000 UTC m=+1310.985291843" watchObservedRunningTime="2026-02-17 15:14:04.571700031 +0000 UTC m=+1310.987540507" Feb 17 15:14:05 crc kubenswrapper[4717]: I0217 15:14:05.564243 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerID="1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be" exitCode=0 Feb 17 15:14:05 crc kubenswrapper[4717]: I0217 15:14:05.565201 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cb760d7-52d5-42d6-9139-f87799aa88b9","Type":"ContainerDied","Data":"1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be"} Feb 17 15:14:05 crc kubenswrapper[4717]: I0217 15:14:05.844392 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 15:14:05 crc kubenswrapper[4717]: I0217 15:14:05.874821 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 
15:14:06 crc kubenswrapper[4717]: I0217 15:14:06.604405 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 15:14:06 crc kubenswrapper[4717]: I0217 15:14:06.878792 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 15:14:06 crc kubenswrapper[4717]: I0217 15:14:06.878845 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 15:14:07 crc kubenswrapper[4717]: I0217 15:14:07.962618 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a1070b40-313c-468b-af8f-6d322571152d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 15:14:07 crc kubenswrapper[4717]: I0217 15:14:07.962642 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a1070b40-313c-468b-af8f-6d322571152d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.183149 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.265844 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-scripts\") pod \"7cb760d7-52d5-42d6-9139-f87799aa88b9\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.265905 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-log-httpd\") pod \"7cb760d7-52d5-42d6-9139-f87799aa88b9\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.265958 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-combined-ca-bundle\") pod \"7cb760d7-52d5-42d6-9139-f87799aa88b9\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.266007 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-config-data\") pod \"7cb760d7-52d5-42d6-9139-f87799aa88b9\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.266059 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-run-httpd\") pod \"7cb760d7-52d5-42d6-9139-f87799aa88b9\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.266214 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-sg-core-conf-yaml\") pod \"7cb760d7-52d5-42d6-9139-f87799aa88b9\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.266244 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct274\" (UniqueName: \"kubernetes.io/projected/7cb760d7-52d5-42d6-9139-f87799aa88b9-kube-api-access-ct274\") pod \"7cb760d7-52d5-42d6-9139-f87799aa88b9\" (UID: \"7cb760d7-52d5-42d6-9139-f87799aa88b9\") " Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.266469 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7cb760d7-52d5-42d6-9139-f87799aa88b9" (UID: "7cb760d7-52d5-42d6-9139-f87799aa88b9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.266721 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.266887 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7cb760d7-52d5-42d6-9139-f87799aa88b9" (UID: "7cb760d7-52d5-42d6-9139-f87799aa88b9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.296675 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-scripts" (OuterVolumeSpecName: "scripts") pod "7cb760d7-52d5-42d6-9139-f87799aa88b9" (UID: "7cb760d7-52d5-42d6-9139-f87799aa88b9"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.302313 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb760d7-52d5-42d6-9139-f87799aa88b9-kube-api-access-ct274" (OuterVolumeSpecName: "kube-api-access-ct274") pod "7cb760d7-52d5-42d6-9139-f87799aa88b9" (UID: "7cb760d7-52d5-42d6-9139-f87799aa88b9"). InnerVolumeSpecName "kube-api-access-ct274". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.311982 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7cb760d7-52d5-42d6-9139-f87799aa88b9" (UID: "7cb760d7-52d5-42d6-9139-f87799aa88b9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.368056 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.368328 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct274\" (UniqueName: \"kubernetes.io/projected/7cb760d7-52d5-42d6-9139-f87799aa88b9-kube-api-access-ct274\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.368879 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.368971 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7cb760d7-52d5-42d6-9139-f87799aa88b9-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.391234 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cb760d7-52d5-42d6-9139-f87799aa88b9" (UID: "7cb760d7-52d5-42d6-9139-f87799aa88b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.411776 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-config-data" (OuterVolumeSpecName: "config-data") pod "7cb760d7-52d5-42d6-9139-f87799aa88b9" (UID: "7cb760d7-52d5-42d6-9139-f87799aa88b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.470285 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.470337 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb760d7-52d5-42d6-9139-f87799aa88b9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.593236 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerID="4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6" exitCode=0 Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.593280 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7cb760d7-52d5-42d6-9139-f87799aa88b9","Type":"ContainerDied","Data":"4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6"} Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.593291 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.593311 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7cb760d7-52d5-42d6-9139-f87799aa88b9","Type":"ContainerDied","Data":"ac459812fa123f127ba2b13703eafee5469f551351db00806adf895971e35456"} Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.593331 4717 scope.go:117] "RemoveContainer" containerID="13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.633909 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.638046 4717 scope.go:117] "RemoveContainer" containerID="43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.648133 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.685133 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:08 crc kubenswrapper[4717]: E0217 15:14:08.685706 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="ceilometer-central-agent" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.685725 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="ceilometer-central-agent" Feb 17 15:14:08 crc kubenswrapper[4717]: E0217 15:14:08.685746 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="proxy-httpd" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.685753 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="proxy-httpd" Feb 17 15:14:08 crc kubenswrapper[4717]: E0217 15:14:08.685814 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="sg-core" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.685823 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="sg-core" Feb 17 15:14:08 crc kubenswrapper[4717]: E0217 15:14:08.685843 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="ceilometer-notification-agent" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.685850 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="ceilometer-notification-agent" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.686184 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="proxy-httpd" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.686205 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="sg-core" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.686222 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="ceilometer-notification-agent" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.686236 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" containerName="ceilometer-central-agent" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.689854 4717 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.691571 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.691925 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.695369 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.699607 4717 scope.go:117] "RemoveContainer" containerID="4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.711070 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.730551 4717 scope.go:117] "RemoveContainer" containerID="1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.754910 4717 scope.go:117] "RemoveContainer" containerID="13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb" Feb 17 15:14:08 crc kubenswrapper[4717]: E0217 15:14:08.755251 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb\": container with ID starting with 13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb not found: ID does not exist" containerID="13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.755274 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb"} err="failed to get container status 
\"13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb\": rpc error: code = NotFound desc = could not find container \"13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb\": container with ID starting with 13a7d9aff868ea7834a6ab38926a8b37446dfbada6c2303aef1adaee522355bb not found: ID does not exist" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.755296 4717 scope.go:117] "RemoveContainer" containerID="43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0" Feb 17 15:14:08 crc kubenswrapper[4717]: E0217 15:14:08.755540 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0\": container with ID starting with 43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0 not found: ID does not exist" containerID="43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.755563 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0"} err="failed to get container status \"43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0\": rpc error: code = NotFound desc = could not find container \"43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0\": container with ID starting with 43d2238f3e6cdf1b90eebb23e750d0fd4bac53aeba053513f3a46442d85c01b0 not found: ID does not exist" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.755577 4717 scope.go:117] "RemoveContainer" containerID="4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6" Feb 17 15:14:08 crc kubenswrapper[4717]: E0217 15:14:08.755764 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6\": container with ID starting with 4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6 not found: ID does not exist" containerID="4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.755784 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6"} err="failed to get container status \"4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6\": rpc error: code = NotFound desc = could not find container \"4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6\": container with ID starting with 4a609e05de6d4977088c509238ee4fb7b01558b44ae3aa0e22b148e929efa5c6 not found: ID does not exist" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.755794 4717 scope.go:117] "RemoveContainer" containerID="1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be" Feb 17 15:14:08 crc kubenswrapper[4717]: E0217 15:14:08.756000 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be\": container with ID starting with 1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be not found: ID does not exist" containerID="1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.756022 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be"} err="failed to get container status \"1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be\": rpc error: code = NotFound desc = could not find container \"1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be\": container with ID 
starting with 1700ff91fae1d2802ff0e87c89334a8d4634266a8e4a21d5013019d525a972be not found: ID does not exist" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.773785 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.773856 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.773938 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.774006 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-config-data\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.774096 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnblx\" (UniqueName: \"kubernetes.io/projected/95803b30-826c-493c-8761-10cc156b4c4b-kube-api-access-jnblx\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 
17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.774149 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-scripts\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.774405 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.774470 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.875764 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.875804 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.875838 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.875896 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-config-data\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.875927 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnblx\" (UniqueName: \"kubernetes.io/projected/95803b30-826c-493c-8761-10cc156b4c4b-kube-api-access-jnblx\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.875948 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-scripts\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.875989 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.876002 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 
15:14:08.878857 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.882525 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-scripts\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.882681 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.883046 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.884436 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-config-data\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.888046 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " 
pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.898699 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:08 crc kubenswrapper[4717]: I0217 15:14:08.902189 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnblx\" (UniqueName: \"kubernetes.io/projected/95803b30-826c-493c-8761-10cc156b4c4b-kube-api-access-jnblx\") pod \"ceilometer-0\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " pod="openstack/ceilometer-0" Feb 17 15:14:09 crc kubenswrapper[4717]: I0217 15:14:09.018290 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:09 crc kubenswrapper[4717]: I0217 15:14:09.629227 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:09 crc kubenswrapper[4717]: W0217 15:14:09.642821 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95803b30_826c_493c_8761_10cc156b4c4b.slice/crio-f688367d4c5f1c1cd48e081ecb2be958032bcde45203929a93b0adeedee20d9e WatchSource:0}: Error finding container f688367d4c5f1c1cd48e081ecb2be958032bcde45203929a93b0adeedee20d9e: Status 404 returned error can't find the container with id f688367d4c5f1c1cd48e081ecb2be958032bcde45203929a93b0adeedee20d9e Feb 17 15:14:09 crc kubenswrapper[4717]: I0217 15:14:09.861025 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb760d7-52d5-42d6-9139-f87799aa88b9" path="/var/lib/kubelet/pods/7cb760d7-52d5-42d6-9139-f87799aa88b9/volumes" Feb 17 15:14:10 crc kubenswrapper[4717]: I0217 15:14:10.615296 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"95803b30-826c-493c-8761-10cc156b4c4b","Type":"ContainerStarted","Data":"52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4"} Feb 17 15:14:10 crc kubenswrapper[4717]: I0217 15:14:10.615635 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95803b30-826c-493c-8761-10cc156b4c4b","Type":"ContainerStarted","Data":"f688367d4c5f1c1cd48e081ecb2be958032bcde45203929a93b0adeedee20d9e"} Feb 17 15:14:11 crc kubenswrapper[4717]: I0217 15:14:11.025756 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 15:14:11 crc kubenswrapper[4717]: I0217 15:14:11.029525 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 15:14:11 crc kubenswrapper[4717]: I0217 15:14:11.032475 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 15:14:11 crc kubenswrapper[4717]: I0217 15:14:11.625589 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95803b30-826c-493c-8761-10cc156b4c4b","Type":"ContainerStarted","Data":"372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15"} Feb 17 15:14:11 crc kubenswrapper[4717]: I0217 15:14:11.626108 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95803b30-826c-493c-8761-10cc156b4c4b","Type":"ContainerStarted","Data":"f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e"} Feb 17 15:14:11 crc kubenswrapper[4717]: I0217 15:14:11.630026 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.235407 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.452991 4717 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.565738 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-config-data\") pod \"e21a925c-a968-4b57-8e0c-941b4a403a2b\" (UID: \"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.565803 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkz8f\" (UniqueName: \"kubernetes.io/projected/e21a925c-a968-4b57-8e0c-941b4a403a2b-kube-api-access-tkz8f\") pod \"e21a925c-a968-4b57-8e0c-941b4a403a2b\" (UID: \"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.566007 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-combined-ca-bundle\") pod \"e21a925c-a968-4b57-8e0c-941b4a403a2b\" (UID: \"e21a925c-a968-4b57-8e0c-941b4a403a2b\") " Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.572375 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e21a925c-a968-4b57-8e0c-941b4a403a2b-kube-api-access-tkz8f" (OuterVolumeSpecName: "kube-api-access-tkz8f") pod "e21a925c-a968-4b57-8e0c-941b4a403a2b" (UID: "e21a925c-a968-4b57-8e0c-941b4a403a2b"). InnerVolumeSpecName "kube-api-access-tkz8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.591433 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-config-data" (OuterVolumeSpecName: "config-data") pod "e21a925c-a968-4b57-8e0c-941b4a403a2b" (UID: "e21a925c-a968-4b57-8e0c-941b4a403a2b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.602747 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e21a925c-a968-4b57-8e0c-941b4a403a2b" (UID: "e21a925c-a968-4b57-8e0c-941b4a403a2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.642887 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95803b30-826c-493c-8761-10cc156b4c4b","Type":"ContainerStarted","Data":"d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7"} Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.642965 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.644917 4717 generic.go:334] "Generic (PLEG): container finished" podID="e21a925c-a968-4b57-8e0c-941b4a403a2b" containerID="75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930" exitCode=137 Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.644945 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.644984 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e21a925c-a968-4b57-8e0c-941b4a403a2b","Type":"ContainerDied","Data":"75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930"} Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.645001 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e21a925c-a968-4b57-8e0c-941b4a403a2b","Type":"ContainerDied","Data":"c824311c97c6194f6d2e114954006d7c2234e7f8626f4a4a4f87863d32fa4cd8"} Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.645017 4717 scope.go:117] "RemoveContainer" containerID="75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.671268 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.474811958 podStartE2EDuration="5.67125348s" podCreationTimestamp="2026-02-17 15:14:08 +0000 UTC" firstStartedPulling="2026-02-17 15:14:09.645604248 +0000 UTC m=+1316.061444724" lastFinishedPulling="2026-02-17 15:14:12.84204577 +0000 UTC m=+1319.257886246" observedRunningTime="2026-02-17 15:14:13.668152912 +0000 UTC m=+1320.083993398" watchObservedRunningTime="2026-02-17 15:14:13.67125348 +0000 UTC m=+1320.087093956" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.678236 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.678268 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkz8f\" (UniqueName: \"kubernetes.io/projected/e21a925c-a968-4b57-8e0c-941b4a403a2b-kube-api-access-tkz8f\") on node \"crc\" DevicePath 
\"\"" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.678278 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21a925c-a968-4b57-8e0c-941b4a403a2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.694126 4717 scope.go:117] "RemoveContainer" containerID="75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930" Feb 17 15:14:13 crc kubenswrapper[4717]: E0217 15:14:13.694953 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930\": container with ID starting with 75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930 not found: ID does not exist" containerID="75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.694993 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930"} err="failed to get container status \"75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930\": rpc error: code = NotFound desc = could not find container \"75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930\": container with ID starting with 75c79e8b31b2f0f867834fe8a030d48d0294104f8580136c14ffb85bc92b3930 not found: ID does not exist" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.705509 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.716333 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.732695 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 
15:14:13 crc kubenswrapper[4717]: E0217 15:14:13.733346 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21a925c-a968-4b57-8e0c-941b4a403a2b" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.733365 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21a925c-a968-4b57-8e0c-941b4a403a2b" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.733591 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e21a925c-a968-4b57-8e0c-941b4a403a2b" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.734400 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.737840 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.738040 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.738205 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.747305 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.780408 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.780525 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sldf\" (UniqueName: \"kubernetes.io/projected/90d942b5-ae77-4210-b456-ca573622fc06-kube-api-access-7sldf\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.780561 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.780623 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.780772 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.867362 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e21a925c-a968-4b57-8e0c-941b4a403a2b" path="/var/lib/kubelet/pods/e21a925c-a968-4b57-8e0c-941b4a403a2b/volumes" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.883116 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.883264 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sldf\" (UniqueName: \"kubernetes.io/projected/90d942b5-ae77-4210-b456-ca573622fc06-kube-api-access-7sldf\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.883334 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.883378 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.883539 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.887999 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.888427 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.888689 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.889558 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90d942b5-ae77-4210-b456-ca573622fc06-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:13 crc kubenswrapper[4717]: I0217 15:14:13.901628 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sldf\" (UniqueName: \"kubernetes.io/projected/90d942b5-ae77-4210-b456-ca573622fc06-kube-api-access-7sldf\") pod \"nova-cell1-novncproxy-0\" (UID: \"90d942b5-ae77-4210-b456-ca573622fc06\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:14 crc kubenswrapper[4717]: I0217 15:14:14.063373 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:14 crc kubenswrapper[4717]: I0217 15:14:14.541271 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 15:14:14 crc kubenswrapper[4717]: W0217 15:14:14.548174 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90d942b5_ae77_4210_b456_ca573622fc06.slice/crio-21b360456800feb391ed9be1a55cb971d23c5394c269d216ec3f3fcbbc8d7df4 WatchSource:0}: Error finding container 21b360456800feb391ed9be1a55cb971d23c5394c269d216ec3f3fcbbc8d7df4: Status 404 returned error can't find the container with id 21b360456800feb391ed9be1a55cb971d23c5394c269d216ec3f3fcbbc8d7df4 Feb 17 15:14:14 crc kubenswrapper[4717]: I0217 15:14:14.655583 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"90d942b5-ae77-4210-b456-ca573622fc06","Type":"ContainerStarted","Data":"21b360456800feb391ed9be1a55cb971d23c5394c269d216ec3f3fcbbc8d7df4"} Feb 17 15:14:15 crc kubenswrapper[4717]: I0217 15:14:15.667535 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"90d942b5-ae77-4210-b456-ca573622fc06","Type":"ContainerStarted","Data":"982962b73f26931917690fcfd115a5ea54857ee7c993f97289a4297ffe955e40"} Feb 17 15:14:15 crc kubenswrapper[4717]: I0217 15:14:15.701817 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.701797919 podStartE2EDuration="2.701797919s" podCreationTimestamp="2026-02-17 15:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:14:15.692020961 +0000 UTC m=+1322.107861447" watchObservedRunningTime="2026-02-17 15:14:15.701797919 +0000 UTC m=+1322.117638385" Feb 17 15:14:16 crc kubenswrapper[4717]: I0217 
15:14:16.882360 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 15:14:16 crc kubenswrapper[4717]: I0217 15:14:16.883004 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 15:14:16 crc kubenswrapper[4717]: I0217 15:14:16.884207 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 15:14:16 crc kubenswrapper[4717]: I0217 15:14:16.885735 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 15:14:17 crc kubenswrapper[4717]: I0217 15:14:17.687847 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 15:14:17 crc kubenswrapper[4717]: I0217 15:14:17.691938 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 15:14:17 crc kubenswrapper[4717]: I0217 15:14:17.901406 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-g7ft8"] Feb 17 15:14:17 crc kubenswrapper[4717]: I0217 15:14:17.903097 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:17 crc kubenswrapper[4717]: I0217 15:14:17.913013 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-g7ft8"] Feb 17 15:14:17 crc kubenswrapper[4717]: I0217 15:14:17.968380 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:17 crc kubenswrapper[4717]: I0217 15:14:17.968475 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:17 crc kubenswrapper[4717]: I0217 15:14:17.968518 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwm2r\" (UniqueName: \"kubernetes.io/projected/6f5488fe-614a-4ff6-bb53-f1578e913bdd-kube-api-access-nwm2r\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:17 crc kubenswrapper[4717]: I0217 15:14:17.968572 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:17 crc kubenswrapper[4717]: I0217 15:14:17.968668 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-config\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:17 crc kubenswrapper[4717]: I0217 15:14:17.968736 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.070555 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.070627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwm2r\" (UniqueName: \"kubernetes.io/projected/6f5488fe-614a-4ff6-bb53-f1578e913bdd-kube-api-access-nwm2r\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.070681 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.070746 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-config\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.070787 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.070846 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.071988 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.071987 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.072715 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-config\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.072967 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.073462 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.095144 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwm2r\" (UniqueName: \"kubernetes.io/projected/6f5488fe-614a-4ff6-bb53-f1578e913bdd-kube-api-access-nwm2r\") pod \"dnsmasq-dns-59cf4bdb65-g7ft8\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.225037 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:18 crc kubenswrapper[4717]: I0217 15:14:18.710480 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-g7ft8"] Feb 17 15:14:18 crc kubenswrapper[4717]: W0217 15:14:18.713431 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f5488fe_614a_4ff6_bb53_f1578e913bdd.slice/crio-deeba058bb9c72329c64137b8cb827cfdbef3393433a88ca0b3e29a624f81b23 WatchSource:0}: Error finding container deeba058bb9c72329c64137b8cb827cfdbef3393433a88ca0b3e29a624f81b23: Status 404 returned error can't find the container with id deeba058bb9c72329c64137b8cb827cfdbef3393433a88ca0b3e29a624f81b23 Feb 17 15:14:19 crc kubenswrapper[4717]: I0217 15:14:19.063640 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:19 crc kubenswrapper[4717]: E0217 15:14:19.155587 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f5488fe_614a_4ff6_bb53_f1578e913bdd.slice/crio-9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819.scope\": RecentStats: unable to find data in memory cache]" Feb 17 15:14:19 crc kubenswrapper[4717]: I0217 15:14:19.709901 4717 generic.go:334] "Generic (PLEG): container finished" podID="6f5488fe-614a-4ff6-bb53-f1578e913bdd" containerID="9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819" exitCode=0 Feb 17 15:14:19 crc kubenswrapper[4717]: I0217 15:14:19.710003 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" event={"ID":"6f5488fe-614a-4ff6-bb53-f1578e913bdd","Type":"ContainerDied","Data":"9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819"} Feb 17 15:14:19 crc kubenswrapper[4717]: I0217 15:14:19.710418 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" event={"ID":"6f5488fe-614a-4ff6-bb53-f1578e913bdd","Type":"ContainerStarted","Data":"deeba058bb9c72329c64137b8cb827cfdbef3393433a88ca0b3e29a624f81b23"} Feb 17 15:14:19 crc kubenswrapper[4717]: I0217 15:14:19.758244 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:19 crc kubenswrapper[4717]: I0217 15:14:19.758639 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="ceilometer-central-agent" containerID="cri-o://52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4" gracePeriod=30 Feb 17 15:14:19 crc kubenswrapper[4717]: I0217 15:14:19.758900 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="proxy-httpd" containerID="cri-o://d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7" gracePeriod=30 Feb 17 15:14:19 crc kubenswrapper[4717]: I0217 15:14:19.758979 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="ceilometer-notification-agent" containerID="cri-o://f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e" gracePeriod=30 Feb 17 15:14:19 crc kubenswrapper[4717]: I0217 15:14:19.759110 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="sg-core" containerID="cri-o://372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15" gracePeriod=30 Feb 17 15:14:20 crc kubenswrapper[4717]: I0217 15:14:20.306837 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:20 crc kubenswrapper[4717]: I0217 
15:14:20.722774 4717 generic.go:334] "Generic (PLEG): container finished" podID="95803b30-826c-493c-8761-10cc156b4c4b" containerID="d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7" exitCode=0 Feb 17 15:14:20 crc kubenswrapper[4717]: I0217 15:14:20.723214 4717 generic.go:334] "Generic (PLEG): container finished" podID="95803b30-826c-493c-8761-10cc156b4c4b" containerID="372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15" exitCode=2 Feb 17 15:14:20 crc kubenswrapper[4717]: I0217 15:14:20.723229 4717 generic.go:334] "Generic (PLEG): container finished" podID="95803b30-826c-493c-8761-10cc156b4c4b" containerID="52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4" exitCode=0 Feb 17 15:14:20 crc kubenswrapper[4717]: I0217 15:14:20.722886 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95803b30-826c-493c-8761-10cc156b4c4b","Type":"ContainerDied","Data":"d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7"} Feb 17 15:14:20 crc kubenswrapper[4717]: I0217 15:14:20.723298 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95803b30-826c-493c-8761-10cc156b4c4b","Type":"ContainerDied","Data":"372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15"} Feb 17 15:14:20 crc kubenswrapper[4717]: I0217 15:14:20.723313 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95803b30-826c-493c-8761-10cc156b4c4b","Type":"ContainerDied","Data":"52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4"} Feb 17 15:14:20 crc kubenswrapper[4717]: I0217 15:14:20.725541 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" event={"ID":"6f5488fe-614a-4ff6-bb53-f1578e913bdd","Type":"ContainerStarted","Data":"6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed"} Feb 17 15:14:20 crc kubenswrapper[4717]: I0217 15:14:20.725687 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a1070b40-313c-468b-af8f-6d322571152d" containerName="nova-api-log" containerID="cri-o://4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3" gracePeriod=30 Feb 17 15:14:20 crc kubenswrapper[4717]: I0217 15:14:20.725762 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a1070b40-313c-468b-af8f-6d322571152d" containerName="nova-api-api" containerID="cri-o://e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d" gracePeriod=30 Feb 17 15:14:20 crc kubenswrapper[4717]: I0217 15:14:20.751736 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" podStartSLOduration=3.751718426 podStartE2EDuration="3.751718426s" podCreationTimestamp="2026-02-17 15:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:14:20.751459369 +0000 UTC m=+1327.167299865" watchObservedRunningTime="2026-02-17 15:14:20.751718426 +0000 UTC m=+1327.167558902" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.420786 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.475063 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-sg-core-conf-yaml\") pod \"95803b30-826c-493c-8761-10cc156b4c4b\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.475187 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-log-httpd\") pod \"95803b30-826c-493c-8761-10cc156b4c4b\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.475238 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-config-data\") pod \"95803b30-826c-493c-8761-10cc156b4c4b\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.475272 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-ceilometer-tls-certs\") pod \"95803b30-826c-493c-8761-10cc156b4c4b\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.475342 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnblx\" (UniqueName: \"kubernetes.io/projected/95803b30-826c-493c-8761-10cc156b4c4b-kube-api-access-jnblx\") pod \"95803b30-826c-493c-8761-10cc156b4c4b\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.475393 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-combined-ca-bundle\") pod \"95803b30-826c-493c-8761-10cc156b4c4b\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.475481 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-scripts\") pod \"95803b30-826c-493c-8761-10cc156b4c4b\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.475536 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-run-httpd\") pod \"95803b30-826c-493c-8761-10cc156b4c4b\" (UID: \"95803b30-826c-493c-8761-10cc156b4c4b\") " Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.475825 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "95803b30-826c-493c-8761-10cc156b4c4b" (UID: "95803b30-826c-493c-8761-10cc156b4c4b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.476052 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "95803b30-826c-493c-8761-10cc156b4c4b" (UID: "95803b30-826c-493c-8761-10cc156b4c4b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.476216 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.492136 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95803b30-826c-493c-8761-10cc156b4c4b-kube-api-access-jnblx" (OuterVolumeSpecName: "kube-api-access-jnblx") pod "95803b30-826c-493c-8761-10cc156b4c4b" (UID: "95803b30-826c-493c-8761-10cc156b4c4b"). InnerVolumeSpecName "kube-api-access-jnblx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.496648 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-scripts" (OuterVolumeSpecName: "scripts") pod "95803b30-826c-493c-8761-10cc156b4c4b" (UID: "95803b30-826c-493c-8761-10cc156b4c4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.542567 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "95803b30-826c-493c-8761-10cc156b4c4b" (UID: "95803b30-826c-493c-8761-10cc156b4c4b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.578527 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.578606 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95803b30-826c-493c-8761-10cc156b4c4b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.578642 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.578660 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnblx\" (UniqueName: \"kubernetes.io/projected/95803b30-826c-493c-8761-10cc156b4c4b-kube-api-access-jnblx\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.604580 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "95803b30-826c-493c-8761-10cc156b4c4b" (UID: "95803b30-826c-493c-8761-10cc156b4c4b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.613863 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95803b30-826c-493c-8761-10cc156b4c4b" (UID: "95803b30-826c-493c-8761-10cc156b4c4b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.669515 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-config-data" (OuterVolumeSpecName: "config-data") pod "95803b30-826c-493c-8761-10cc156b4c4b" (UID: "95803b30-826c-493c-8761-10cc156b4c4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.680964 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.681004 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.681020 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95803b30-826c-493c-8761-10cc156b4c4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.736057 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.736127 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95803b30-826c-493c-8761-10cc156b4c4b","Type":"ContainerDied","Data":"f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e"} Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.736193 4717 scope.go:117] "RemoveContainer" containerID="d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.735970 4717 generic.go:334] "Generic (PLEG): container finished" podID="95803b30-826c-493c-8761-10cc156b4c4b" containerID="f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e" exitCode=0 Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.736475 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95803b30-826c-493c-8761-10cc156b4c4b","Type":"ContainerDied","Data":"f688367d4c5f1c1cd48e081ecb2be958032bcde45203929a93b0adeedee20d9e"} Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.738280 4717 generic.go:334] "Generic (PLEG): container finished" podID="a1070b40-313c-468b-af8f-6d322571152d" containerID="4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3" exitCode=143 Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.738806 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1070b40-313c-468b-af8f-6d322571152d","Type":"ContainerDied","Data":"4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3"} Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.738835 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.759214 4717 scope.go:117] "RemoveContainer" containerID="372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15" Feb 17 
15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.775259 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.784048 4717 scope.go:117] "RemoveContainer" containerID="f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.806782 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.816965 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:21 crc kubenswrapper[4717]: E0217 15:14:21.817723 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="ceilometer-notification-agent" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.817750 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="ceilometer-notification-agent" Feb 17 15:14:21 crc kubenswrapper[4717]: E0217 15:14:21.817775 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="sg-core" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.817784 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="sg-core" Feb 17 15:14:21 crc kubenswrapper[4717]: E0217 15:14:21.817808 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="proxy-httpd" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.817814 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="proxy-httpd" Feb 17 15:14:21 crc kubenswrapper[4717]: E0217 15:14:21.817834 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95803b30-826c-493c-8761-10cc156b4c4b" 
containerName="ceilometer-central-agent" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.817841 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="ceilometer-central-agent" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.818231 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="ceilometer-notification-agent" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.818270 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="proxy-httpd" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.818314 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="ceilometer-central-agent" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.818329 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="95803b30-826c-493c-8761-10cc156b4c4b" containerName="sg-core" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.839895 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.840015 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.842032 4717 scope.go:117] "RemoveContainer" containerID="52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.843614 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.843743 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.843834 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.860151 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95803b30-826c-493c-8761-10cc156b4c4b" path="/var/lib/kubelet/pods/95803b30-826c-493c-8761-10cc156b4c4b/volumes" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.883332 4717 scope.go:117] "RemoveContainer" containerID="d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7" Feb 17 15:14:21 crc kubenswrapper[4717]: E0217 15:14:21.883956 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7\": container with ID starting with d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7 not found: ID does not exist" containerID="d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.883990 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7"} err="failed to get container status \"d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7\": rpc error: code = NotFound desc 
= could not find container \"d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7\": container with ID starting with d58fb4443d2b7ca780894f34f46b272932b0c371d34e639d2d09382b17f0a4f7 not found: ID does not exist" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.884012 4717 scope.go:117] "RemoveContainer" containerID="372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15" Feb 17 15:14:21 crc kubenswrapper[4717]: E0217 15:14:21.887387 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15\": container with ID starting with 372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15 not found: ID does not exist" containerID="372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.887443 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15"} err="failed to get container status \"372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15\": rpc error: code = NotFound desc = could not find container \"372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15\": container with ID starting with 372544c954474124b4a01b6019c3d8be847d9551ec9aec540aa81647c2e71e15 not found: ID does not exist" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.887479 4717 scope.go:117] "RemoveContainer" containerID="f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e" Feb 17 15:14:21 crc kubenswrapper[4717]: E0217 15:14:21.893239 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e\": container with ID starting with f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e not 
found: ID does not exist" containerID="f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.893291 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e"} err="failed to get container status \"f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e\": rpc error: code = NotFound desc = could not find container \"f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e\": container with ID starting with f60c452cd11b56385aaeb2b88a271a36b030124bc0054f9e00e23d7d36c38f0e not found: ID does not exist" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.893325 4717 scope.go:117] "RemoveContainer" containerID="52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4" Feb 17 15:14:21 crc kubenswrapper[4717]: E0217 15:14:21.896374 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4\": container with ID starting with 52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4 not found: ID does not exist" containerID="52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.896417 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4"} err="failed to get container status \"52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4\": rpc error: code = NotFound desc = could not find container \"52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4\": container with ID starting with 52621f31817e35baa394c875b812f1ff5d865418594b9341e3b7b17b60fa34f4 not found: ID does not exist" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.897332 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9h8f\" (UniqueName: \"kubernetes.io/projected/cc6f7276-7e94-46fd-a61d-982ff14ea063-kube-api-access-f9h8f\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.897436 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-log-httpd\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.897517 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-run-httpd\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.897583 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.897625 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.897670 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.897708 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-scripts\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:21 crc kubenswrapper[4717]: I0217 15:14:21.897755 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-config-data\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.000188 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-log-httpd\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.000256 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-run-httpd\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.000283 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " 
pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.000302 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.000333 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.000372 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-scripts\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.000415 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-config-data\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.000479 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9h8f\" (UniqueName: \"kubernetes.io/projected/cc6f7276-7e94-46fd-a61d-982ff14ea063-kube-api-access-f9h8f\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.001144 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-log-httpd\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.001365 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-run-httpd\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.004779 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.007635 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.008383 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-scripts\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.008609 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-config-data\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.009234 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.009499 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:22 crc kubenswrapper[4717]: E0217 15:14:22.010251 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-f9h8f], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="cc6f7276-7e94-46fd-a61d-982ff14ea063" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.020944 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9h8f\" (UniqueName: \"kubernetes.io/projected/cc6f7276-7e94-46fd-a61d-982ff14ea063-kube-api-access-f9h8f\") pod \"ceilometer-0\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.753748 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.765796 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.815046 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-combined-ca-bundle\") pod \"cc6f7276-7e94-46fd-a61d-982ff14ea063\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.815132 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-run-httpd\") pod \"cc6f7276-7e94-46fd-a61d-982ff14ea063\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.815181 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-sg-core-conf-yaml\") pod \"cc6f7276-7e94-46fd-a61d-982ff14ea063\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.815564 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cc6f7276-7e94-46fd-a61d-982ff14ea063" (UID: "cc6f7276-7e94-46fd-a61d-982ff14ea063"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.815654 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-config-data\") pod \"cc6f7276-7e94-46fd-a61d-982ff14ea063\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.816003 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9h8f\" (UniqueName: \"kubernetes.io/projected/cc6f7276-7e94-46fd-a61d-982ff14ea063-kube-api-access-f9h8f\") pod \"cc6f7276-7e94-46fd-a61d-982ff14ea063\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.816072 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-ceilometer-tls-certs\") pod \"cc6f7276-7e94-46fd-a61d-982ff14ea063\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.816218 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-log-httpd\") pod \"cc6f7276-7e94-46fd-a61d-982ff14ea063\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.816245 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-scripts\") pod \"cc6f7276-7e94-46fd-a61d-982ff14ea063\" (UID: \"cc6f7276-7e94-46fd-a61d-982ff14ea063\") " Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.816559 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cc6f7276-7e94-46fd-a61d-982ff14ea063" (UID: "cc6f7276-7e94-46fd-a61d-982ff14ea063"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.817237 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.817262 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc6f7276-7e94-46fd-a61d-982ff14ea063-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.819989 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc6f7276-7e94-46fd-a61d-982ff14ea063" (UID: "cc6f7276-7e94-46fd-a61d-982ff14ea063"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.820051 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6f7276-7e94-46fd-a61d-982ff14ea063-kube-api-access-f9h8f" (OuterVolumeSpecName: "kube-api-access-f9h8f") pod "cc6f7276-7e94-46fd-a61d-982ff14ea063" (UID: "cc6f7276-7e94-46fd-a61d-982ff14ea063"). InnerVolumeSpecName "kube-api-access-f9h8f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.821572 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cc6f7276-7e94-46fd-a61d-982ff14ea063" (UID: "cc6f7276-7e94-46fd-a61d-982ff14ea063"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.821660 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-config-data" (OuterVolumeSpecName: "config-data") pod "cc6f7276-7e94-46fd-a61d-982ff14ea063" (UID: "cc6f7276-7e94-46fd-a61d-982ff14ea063"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.822365 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cc6f7276-7e94-46fd-a61d-982ff14ea063" (UID: "cc6f7276-7e94-46fd-a61d-982ff14ea063"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.824598 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-scripts" (OuterVolumeSpecName: "scripts") pod "cc6f7276-7e94-46fd-a61d-982ff14ea063" (UID: "cc6f7276-7e94-46fd-a61d-982ff14ea063"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.919620 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.919655 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.919689 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9h8f\" (UniqueName: \"kubernetes.io/projected/cc6f7276-7e94-46fd-a61d-982ff14ea063-kube-api-access-f9h8f\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.919704 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.919716 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:22 crc kubenswrapper[4717]: I0217 15:14:22.919727 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc6f7276-7e94-46fd-a61d-982ff14ea063-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.762460 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.828797 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.869236 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.869305 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.874988 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.875171 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.878014 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.878269 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.878875 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.945352 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-config-data\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.945399 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.945439 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fa734ee-b468-4462-850d-9f347c991241-run-httpd\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.945466 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fa734ee-b468-4462-850d-9f347c991241-log-httpd\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.945485 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkghn\" (UniqueName: \"kubernetes.io/projected/6fa734ee-b468-4462-850d-9f347c991241-kube-api-access-dkghn\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.945513 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-scripts\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 15:14:23.945553 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:23 crc kubenswrapper[4717]: I0217 
15:14:23.945576 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.053262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.053309 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.053408 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-config-data\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.053429 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.053469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6fa734ee-b468-4462-850d-9f347c991241-run-httpd\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.053492 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fa734ee-b468-4462-850d-9f347c991241-log-httpd\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.053509 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkghn\" (UniqueName: \"kubernetes.io/projected/6fa734ee-b468-4462-850d-9f347c991241-kube-api-access-dkghn\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.053537 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-scripts\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.054337 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fa734ee-b468-4462-850d-9f347c991241-log-httpd\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.054587 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fa734ee-b468-4462-850d-9f347c991241-run-httpd\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.059025 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.064478 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.070107 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.073992 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.074118 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-scripts\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.074713 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkghn\" (UniqueName: \"kubernetes.io/projected/6fa734ee-b468-4462-850d-9f347c991241-kube-api-access-dkghn\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.075735 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fa734ee-b468-4462-850d-9f347c991241-config-data\") pod \"ceilometer-0\" (UID: \"6fa734ee-b468-4462-850d-9f347c991241\") " pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.082804 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.203325 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.321592 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.464364 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1070b40-313c-468b-af8f-6d322571152d-logs\") pod \"a1070b40-313c-468b-af8f-6d322571152d\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.464491 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-config-data\") pod \"a1070b40-313c-468b-af8f-6d322571152d\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.464542 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-combined-ca-bundle\") pod \"a1070b40-313c-468b-af8f-6d322571152d\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.464582 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4vgk\" (UniqueName: 
\"kubernetes.io/projected/a1070b40-313c-468b-af8f-6d322571152d-kube-api-access-n4vgk\") pod \"a1070b40-313c-468b-af8f-6d322571152d\" (UID: \"a1070b40-313c-468b-af8f-6d322571152d\") " Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.465095 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1070b40-313c-468b-af8f-6d322571152d-logs" (OuterVolumeSpecName: "logs") pod "a1070b40-313c-468b-af8f-6d322571152d" (UID: "a1070b40-313c-468b-af8f-6d322571152d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.465232 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1070b40-313c-468b-af8f-6d322571152d-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.469667 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1070b40-313c-468b-af8f-6d322571152d-kube-api-access-n4vgk" (OuterVolumeSpecName: "kube-api-access-n4vgk") pod "a1070b40-313c-468b-af8f-6d322571152d" (UID: "a1070b40-313c-468b-af8f-6d322571152d"). InnerVolumeSpecName "kube-api-access-n4vgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.496449 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-config-data" (OuterVolumeSpecName: "config-data") pod "a1070b40-313c-468b-af8f-6d322571152d" (UID: "a1070b40-313c-468b-af8f-6d322571152d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.502266 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1070b40-313c-468b-af8f-6d322571152d" (UID: "a1070b40-313c-468b-af8f-6d322571152d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.567053 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.567136 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1070b40-313c-468b-af8f-6d322571152d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.567153 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4vgk\" (UniqueName: \"kubernetes.io/projected/a1070b40-313c-468b-af8f-6d322571152d-kube-api-access-n4vgk\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.660399 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 15:14:24 crc kubenswrapper[4717]: W0217 15:14:24.664726 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fa734ee_b468_4462_850d_9f347c991241.slice/crio-2440c092c5268344da6b282e5861fbddb5a88c32aaf284544f78acaa5486b0a6 WatchSource:0}: Error finding container 2440c092c5268344da6b282e5861fbddb5a88c32aaf284544f78acaa5486b0a6: Status 404 returned error can't find the container with id 
2440c092c5268344da6b282e5861fbddb5a88c32aaf284544f78acaa5486b0a6 Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.667802 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.781695 4717 generic.go:334] "Generic (PLEG): container finished" podID="a1070b40-313c-468b-af8f-6d322571152d" containerID="e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d" exitCode=0 Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.782145 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1070b40-313c-468b-af8f-6d322571152d","Type":"ContainerDied","Data":"e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d"} Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.782179 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1070b40-313c-468b-af8f-6d322571152d","Type":"ContainerDied","Data":"b5068eb7e9120bd35a32ceb1c4e556e5935736b4708629f5415bfbc5a115f705"} Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.782207 4717 scope.go:117] "RemoveContainer" containerID="e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.782394 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.785823 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fa734ee-b468-4462-850d-9f347c991241","Type":"ContainerStarted","Data":"2440c092c5268344da6b282e5861fbddb5a88c32aaf284544f78acaa5486b0a6"} Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.812249 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.823703 4717 scope.go:117] "RemoveContainer" containerID="4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.844255 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.872062 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.876940 4717 scope.go:117] "RemoveContainer" containerID="e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d" Feb 17 15:14:24 crc kubenswrapper[4717]: E0217 15:14:24.877462 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d\": container with ID starting with e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d not found: ID does not exist" containerID="e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.877503 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d"} err="failed to get container status \"e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d\": rpc error: code = 
NotFound desc = could not find container \"e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d\": container with ID starting with e633bee56f9a403ef56c9e492ef5f5e5093ccb52578189941b5791b2ccb8536d not found: ID does not exist" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.877530 4717 scope.go:117] "RemoveContainer" containerID="4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3" Feb 17 15:14:24 crc kubenswrapper[4717]: E0217 15:14:24.877914 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3\": container with ID starting with 4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3 not found: ID does not exist" containerID="4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.877935 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3"} err="failed to get container status \"4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3\": rpc error: code = NotFound desc = could not find container \"4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3\": container with ID starting with 4e3edf340d3344f6302d43e4e78a854ff55ea7f030b1d9f5d983d3a84547d1d3 not found: ID does not exist" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.887496 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:24 crc kubenswrapper[4717]: E0217 15:14:24.887949 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1070b40-313c-468b-af8f-6d322571152d" containerName="nova-api-log" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.887969 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1070b40-313c-468b-af8f-6d322571152d" 
containerName="nova-api-log" Feb 17 15:14:24 crc kubenswrapper[4717]: E0217 15:14:24.887990 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1070b40-313c-468b-af8f-6d322571152d" containerName="nova-api-api" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.887997 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1070b40-313c-468b-af8f-6d322571152d" containerName="nova-api-api" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.888206 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1070b40-313c-468b-af8f-6d322571152d" containerName="nova-api-api" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.888229 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1070b40-313c-468b-af8f-6d322571152d" containerName="nova-api-log" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.889156 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.896229 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.896292 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.896968 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.916227 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.980565 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-config-data\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 
15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.980836 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.981057 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.981276 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tswgc\" (UniqueName: \"kubernetes.io/projected/c6ad65d7-0a8a-4022-885f-792da81b62df-kube-api-access-tswgc\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.981430 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6ad65d7-0a8a-4022-885f-792da81b62df-logs\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.981576 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-public-tls-certs\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.989286 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-qzr8m"] Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.990998 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.994499 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.994768 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 17 15:14:24 crc kubenswrapper[4717]: I0217 15:14:24.997861 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-qzr8m"] Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.083480 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.083534 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.083568 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tswgc\" (UniqueName: \"kubernetes.io/projected/c6ad65d7-0a8a-4022-885f-792da81b62df-kube-api-access-tswgc\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.083613 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6ad65d7-0a8a-4022-885f-792da81b62df-logs\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.083666 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-public-tls-certs\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.083740 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-config-data\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.083774 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-scripts\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.083820 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-config-data\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.083844 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.083887 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg8c7\" (UniqueName: \"kubernetes.io/projected/94b2abfd-5466-4810-b0ba-dfd4a956549b-kube-api-access-fg8c7\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.084000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6ad65d7-0a8a-4022-885f-792da81b62df-logs\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.089535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.089691 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.089773 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-config-data\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.102275 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-public-tls-certs\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.102319 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tswgc\" (UniqueName: \"kubernetes.io/projected/c6ad65d7-0a8a-4022-885f-792da81b62df-kube-api-access-tswgc\") pod \"nova-api-0\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.185279 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg8c7\" (UniqueName: \"kubernetes.io/projected/94b2abfd-5466-4810-b0ba-dfd4a956549b-kube-api-access-fg8c7\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.185378 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.185515 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-config-data\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.185548 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-scripts\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.189232 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-scripts\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.189846 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.190171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-config-data\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.203715 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg8c7\" (UniqueName: \"kubernetes.io/projected/94b2abfd-5466-4810-b0ba-dfd4a956549b-kube-api-access-fg8c7\") pod \"nova-cell1-cell-mapping-qzr8m\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.210134 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.346247 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:25 crc kubenswrapper[4717]: W0217 15:14:25.689280 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6ad65d7_0a8a_4022_885f_792da81b62df.slice/crio-e09b169dfa1c64bb818c21a09a674d2e6074d8ae020ee512556942b5ddd6e0d6 WatchSource:0}: Error finding container e09b169dfa1c64bb818c21a09a674d2e6074d8ae020ee512556942b5ddd6e0d6: Status 404 returned error can't find the container with id e09b169dfa1c64bb818c21a09a674d2e6074d8ae020ee512556942b5ddd6e0d6 Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.691779 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.804465 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fa734ee-b468-4462-850d-9f347c991241","Type":"ContainerStarted","Data":"b75242d6dba70863900b905a20f109d5144dd15b165e92cfd36cb4090516ba4c"} Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.806488 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6ad65d7-0a8a-4022-885f-792da81b62df","Type":"ContainerStarted","Data":"e09b169dfa1c64bb818c21a09a674d2e6074d8ae020ee512556942b5ddd6e0d6"} Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.862817 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1070b40-313c-468b-af8f-6d322571152d" path="/var/lib/kubelet/pods/a1070b40-313c-468b-af8f-6d322571152d/volumes" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.863573 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc6f7276-7e94-46fd-a61d-982ff14ea063" 
path="/var/lib/kubelet/pods/cc6f7276-7e94-46fd-a61d-982ff14ea063/volumes" Feb 17 15:14:25 crc kubenswrapper[4717]: I0217 15:14:25.864106 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-qzr8m"] Feb 17 15:14:26 crc kubenswrapper[4717]: I0217 15:14:26.816009 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fa734ee-b468-4462-850d-9f347c991241","Type":"ContainerStarted","Data":"7dbe0b2e2dd5541d393a576ad9a817e44e40556a05f8f6bbcb4bbe926713c3fa"} Feb 17 15:14:26 crc kubenswrapper[4717]: I0217 15:14:26.816367 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fa734ee-b468-4462-850d-9f347c991241","Type":"ContainerStarted","Data":"a24d9a0240bad8eafefb4b39f0bb6607c2b09be342337c925efc3b6ec2e3851d"} Feb 17 15:14:26 crc kubenswrapper[4717]: I0217 15:14:26.817973 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qzr8m" event={"ID":"94b2abfd-5466-4810-b0ba-dfd4a956549b","Type":"ContainerStarted","Data":"03d544f8840202792cf82407d5091b13d23e943807fe029aee1d4cf98070d1ab"} Feb 17 15:14:26 crc kubenswrapper[4717]: I0217 15:14:26.818250 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qzr8m" event={"ID":"94b2abfd-5466-4810-b0ba-dfd4a956549b","Type":"ContainerStarted","Data":"dbe1926a31c6ab32059d3dcfb02514467b4176b3a6e442bd3cdec2b6fb3999bc"} Feb 17 15:14:26 crc kubenswrapper[4717]: I0217 15:14:26.821269 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6ad65d7-0a8a-4022-885f-792da81b62df","Type":"ContainerStarted","Data":"4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18"} Feb 17 15:14:26 crc kubenswrapper[4717]: I0217 15:14:26.821302 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c6ad65d7-0a8a-4022-885f-792da81b62df","Type":"ContainerStarted","Data":"fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c"} Feb 17 15:14:26 crc kubenswrapper[4717]: I0217 15:14:26.838316 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-qzr8m" podStartSLOduration=2.838297769 podStartE2EDuration="2.838297769s" podCreationTimestamp="2026-02-17 15:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:14:26.834900632 +0000 UTC m=+1333.250741108" watchObservedRunningTime="2026-02-17 15:14:26.838297769 +0000 UTC m=+1333.254138245" Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.227438 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.261613 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.261586987 podStartE2EDuration="4.261586987s" podCreationTimestamp="2026-02-17 15:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:14:26.868530257 +0000 UTC m=+1333.284370743" watchObservedRunningTime="2026-02-17 15:14:28.261586987 +0000 UTC m=+1334.677427473" Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.308641 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-gcs9s"] Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.309247 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" podUID="8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" containerName="dnsmasq-dns" containerID="cri-o://78b83d7a05fc7491d26d0f183c14475d542e4f23c36f1d2b6f518e567e67f749" gracePeriod=10 Feb 17 15:14:28 crc 
kubenswrapper[4717]: I0217 15:14:28.840686 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fa734ee-b468-4462-850d-9f347c991241","Type":"ContainerStarted","Data":"03ef5b2a51cb53f9ba778884cd23c3231a361872b93e1873265fd114b8207c93"} Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.841838 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.843837 4717 generic.go:334] "Generic (PLEG): container finished" podID="8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" containerID="78b83d7a05fc7491d26d0f183c14475d542e4f23c36f1d2b6f518e567e67f749" exitCode=0 Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.843875 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" event={"ID":"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5","Type":"ContainerDied","Data":"78b83d7a05fc7491d26d0f183c14475d542e4f23c36f1d2b6f518e567e67f749"} Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.843901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" event={"ID":"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5","Type":"ContainerDied","Data":"04ccbf748503fe77571c38f8134237bee9e0f2905a43e1affebc3d74fbd01408"} Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.843915 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04ccbf748503fe77571c38f8134237bee9e0f2905a43e1affebc3d74fbd01408" Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.879027 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.897580 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.810917526 podStartE2EDuration="5.897563443s" podCreationTimestamp="2026-02-17 15:14:23 +0000 UTC" firstStartedPulling="2026-02-17 15:14:24.667587583 +0000 UTC m=+1331.083428049" lastFinishedPulling="2026-02-17 15:14:27.75423349 +0000 UTC m=+1334.170073966" observedRunningTime="2026-02-17 15:14:28.8643358 +0000 UTC m=+1335.280176296" watchObservedRunningTime="2026-02-17 15:14:28.897563443 +0000 UTC m=+1335.313403919" Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.974045 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-nb\") pod \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.974252 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-config\") pod \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.974309 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-sb\") pod \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.974380 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-svc\") pod 
\"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.974404 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghqtr\" (UniqueName: \"kubernetes.io/projected/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-kube-api-access-ghqtr\") pod \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.974460 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-swift-storage-0\") pod \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\" (UID: \"8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5\") " Feb 17 15:14:28 crc kubenswrapper[4717]: I0217 15:14:28.981122 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-kube-api-access-ghqtr" (OuterVolumeSpecName: "kube-api-access-ghqtr") pod "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" (UID: "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5"). InnerVolumeSpecName "kube-api-access-ghqtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.035719 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" (UID: "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.038934 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" (UID: "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.039205 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-config" (OuterVolumeSpecName: "config") pod "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" (UID: "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.041814 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" (UID: "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.045153 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" (UID: "8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.076461 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.076604 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.076681 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.076772 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghqtr\" (UniqueName: \"kubernetes.io/projected/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-kube-api-access-ghqtr\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.076843 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.076914 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.875969 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-gcs9s" Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.927665 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-gcs9s"] Feb 17 15:14:29 crc kubenswrapper[4717]: I0217 15:14:29.934883 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-gcs9s"] Feb 17 15:14:31 crc kubenswrapper[4717]: I0217 15:14:31.859673 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" path="/var/lib/kubelet/pods/8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5/volumes" Feb 17 15:14:31 crc kubenswrapper[4717]: I0217 15:14:31.894662 4717 generic.go:334] "Generic (PLEG): container finished" podID="94b2abfd-5466-4810-b0ba-dfd4a956549b" containerID="03d544f8840202792cf82407d5091b13d23e943807fe029aee1d4cf98070d1ab" exitCode=0 Feb 17 15:14:31 crc kubenswrapper[4717]: I0217 15:14:31.894705 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qzr8m" event={"ID":"94b2abfd-5466-4810-b0ba-dfd4a956549b","Type":"ContainerDied","Data":"03d544f8840202792cf82407d5091b13d23e943807fe029aee1d4cf98070d1ab"} Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.303358 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.390611 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-scripts\") pod \"94b2abfd-5466-4810-b0ba-dfd4a956549b\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.391136 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-config-data\") pod \"94b2abfd-5466-4810-b0ba-dfd4a956549b\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.391202 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-combined-ca-bundle\") pod \"94b2abfd-5466-4810-b0ba-dfd4a956549b\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.391357 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg8c7\" (UniqueName: \"kubernetes.io/projected/94b2abfd-5466-4810-b0ba-dfd4a956549b-kube-api-access-fg8c7\") pod \"94b2abfd-5466-4810-b0ba-dfd4a956549b\" (UID: \"94b2abfd-5466-4810-b0ba-dfd4a956549b\") " Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.397871 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-scripts" (OuterVolumeSpecName: "scripts") pod "94b2abfd-5466-4810-b0ba-dfd4a956549b" (UID: "94b2abfd-5466-4810-b0ba-dfd4a956549b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.398488 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b2abfd-5466-4810-b0ba-dfd4a956549b-kube-api-access-fg8c7" (OuterVolumeSpecName: "kube-api-access-fg8c7") pod "94b2abfd-5466-4810-b0ba-dfd4a956549b" (UID: "94b2abfd-5466-4810-b0ba-dfd4a956549b"). InnerVolumeSpecName "kube-api-access-fg8c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.423877 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94b2abfd-5466-4810-b0ba-dfd4a956549b" (UID: "94b2abfd-5466-4810-b0ba-dfd4a956549b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.425466 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-config-data" (OuterVolumeSpecName: "config-data") pod "94b2abfd-5466-4810-b0ba-dfd4a956549b" (UID: "94b2abfd-5466-4810-b0ba-dfd4a956549b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.493951 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.493980 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.493993 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg8c7\" (UniqueName: \"kubernetes.io/projected/94b2abfd-5466-4810-b0ba-dfd4a956549b-kube-api-access-fg8c7\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.494001 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94b2abfd-5466-4810-b0ba-dfd4a956549b-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.911886 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qzr8m" event={"ID":"94b2abfd-5466-4810-b0ba-dfd4a956549b","Type":"ContainerDied","Data":"dbe1926a31c6ab32059d3dcfb02514467b4176b3a6e442bd3cdec2b6fb3999bc"} Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.911925 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbe1926a31c6ab32059d3dcfb02514467b4176b3a6e442bd3cdec2b6fb3999bc" Feb 17 15:14:33 crc kubenswrapper[4717]: I0217 15:14:33.911993 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qzr8m" Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.098110 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.098357 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c6ad65d7-0a8a-4022-885f-792da81b62df" containerName="nova-api-log" containerID="cri-o://fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c" gracePeriod=30 Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.098412 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c6ad65d7-0a8a-4022-885f-792da81b62df" containerName="nova-api-api" containerID="cri-o://4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18" gracePeriod=30 Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.110679 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.110870 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d34708cb-eeda-449c-83ff-ea509cc7dbd1" containerName="nova-scheduler-scheduler" containerID="cri-o://c2f167737850bb7a83afbea1f3313c44efcb7cef6d13a8e037b33ab648a7659b" gracePeriod=30 Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.131869 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.132147 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-log" containerID="cri-o://aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384" gracePeriod=30 Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.132222 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-metadata" containerID="cri-o://6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06" gracePeriod=30 Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.808377 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.921341 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-combined-ca-bundle\") pod \"c6ad65d7-0a8a-4022-885f-792da81b62df\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.921390 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tswgc\" (UniqueName: \"kubernetes.io/projected/c6ad65d7-0a8a-4022-885f-792da81b62df-kube-api-access-tswgc\") pod \"c6ad65d7-0a8a-4022-885f-792da81b62df\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.921499 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-internal-tls-certs\") pod \"c6ad65d7-0a8a-4022-885f-792da81b62df\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.921542 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-config-data\") pod \"c6ad65d7-0a8a-4022-885f-792da81b62df\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.921567 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-public-tls-certs\") pod \"c6ad65d7-0a8a-4022-885f-792da81b62df\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.921613 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6ad65d7-0a8a-4022-885f-792da81b62df-logs\") pod \"c6ad65d7-0a8a-4022-885f-792da81b62df\" (UID: \"c6ad65d7-0a8a-4022-885f-792da81b62df\") " Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.922940 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ad65d7-0a8a-4022-885f-792da81b62df-logs" (OuterVolumeSpecName: "logs") pod "c6ad65d7-0a8a-4022-885f-792da81b62df" (UID: "c6ad65d7-0a8a-4022-885f-792da81b62df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.927548 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ad65d7-0a8a-4022-885f-792da81b62df-kube-api-access-tswgc" (OuterVolumeSpecName: "kube-api-access-tswgc") pod "c6ad65d7-0a8a-4022-885f-792da81b62df" (UID: "c6ad65d7-0a8a-4022-885f-792da81b62df"). InnerVolumeSpecName "kube-api-access-tswgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.927863 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerID="aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384" exitCode=143 Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.927944 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea023995-3d37-4a3b-b22b-9903c7e21fc6","Type":"ContainerDied","Data":"aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384"} Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.929370 4717 generic.go:334] "Generic (PLEG): container finished" podID="c6ad65d7-0a8a-4022-885f-792da81b62df" containerID="4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18" exitCode=0 Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.929386 4717 generic.go:334] "Generic (PLEG): container finished" podID="c6ad65d7-0a8a-4022-885f-792da81b62df" containerID="fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c" exitCode=143 Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.929418 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6ad65d7-0a8a-4022-885f-792da81b62df","Type":"ContainerDied","Data":"4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18"} Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.929434 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6ad65d7-0a8a-4022-885f-792da81b62df","Type":"ContainerDied","Data":"fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c"} Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.929444 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c6ad65d7-0a8a-4022-885f-792da81b62df","Type":"ContainerDied","Data":"e09b169dfa1c64bb818c21a09a674d2e6074d8ae020ee512556942b5ddd6e0d6"} Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.929459 4717 scope.go:117] "RemoveContainer" containerID="4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18" Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.929621 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.933336 4717 generic.go:334] "Generic (PLEG): container finished" podID="d34708cb-eeda-449c-83ff-ea509cc7dbd1" containerID="c2f167737850bb7a83afbea1f3313c44efcb7cef6d13a8e037b33ab648a7659b" exitCode=0 Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.933356 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d34708cb-eeda-449c-83ff-ea509cc7dbd1","Type":"ContainerDied","Data":"c2f167737850bb7a83afbea1f3313c44efcb7cef6d13a8e037b33ab648a7659b"} Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.980966 4717 scope.go:117] "RemoveContainer" containerID="fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c" Feb 17 15:14:34 crc kubenswrapper[4717]: I0217 15:14:34.990386 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6ad65d7-0a8a-4022-885f-792da81b62df" (UID: "c6ad65d7-0a8a-4022-885f-792da81b62df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.004994 4717 scope.go:117] "RemoveContainer" containerID="4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18" Feb 17 15:14:35 crc kubenswrapper[4717]: E0217 15:14:35.005443 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18\": container with ID starting with 4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18 not found: ID does not exist" containerID="4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.005469 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18"} err="failed to get container status \"4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18\": rpc error: code = NotFound desc = could not find container \"4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18\": container with ID starting with 4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18 not found: ID does not exist" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.005488 4717 scope.go:117] "RemoveContainer" containerID="fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c" Feb 17 15:14:35 crc kubenswrapper[4717]: E0217 15:14:35.006351 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c\": container with ID starting with fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c not found: ID does not exist" containerID="fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.006380 
4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c"} err="failed to get container status \"fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c\": rpc error: code = NotFound desc = could not find container \"fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c\": container with ID starting with fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c not found: ID does not exist" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.006393 4717 scope.go:117] "RemoveContainer" containerID="4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.006582 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c6ad65d7-0a8a-4022-885f-792da81b62df" (UID: "c6ad65d7-0a8a-4022-885f-792da81b62df"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.006744 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18"} err="failed to get container status \"4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18\": rpc error: code = NotFound desc = could not find container \"4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18\": container with ID starting with 4cb42f5c230e3c85cd3f452de69222d1822487556fbecbf387b6382203dd1c18 not found: ID does not exist" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.006785 4717 scope.go:117] "RemoveContainer" containerID="fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.007270 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c"} err="failed to get container status \"fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c\": rpc error: code = NotFound desc = could not find container \"fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c\": container with ID starting with fc9e81714e831fc024c6c0c0ce3d7a59ce7d3a6aeccaba80d061c5342bd96f0c not found: ID does not exist" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.008563 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c6ad65d7-0a8a-4022-885f-792da81b62df" (UID: "c6ad65d7-0a8a-4022-885f-792da81b62df"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.013273 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-config-data" (OuterVolumeSpecName: "config-data") pod "c6ad65d7-0a8a-4022-885f-792da81b62df" (UID: "c6ad65d7-0a8a-4022-885f-792da81b62df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.023526 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.023568 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.023581 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.023594 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6ad65d7-0a8a-4022-885f-792da81b62df-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.023609 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ad65d7-0a8a-4022-885f-792da81b62df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.023619 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tswgc\" (UniqueName: 
\"kubernetes.io/projected/c6ad65d7-0a8a-4022-885f-792da81b62df-kube-api-access-tswgc\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.034021 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.125131 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxvfl\" (UniqueName: \"kubernetes.io/projected/d34708cb-eeda-449c-83ff-ea509cc7dbd1-kube-api-access-fxvfl\") pod \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.125816 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-config-data\") pod \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.126755 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-combined-ca-bundle\") pod \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\" (UID: \"d34708cb-eeda-449c-83ff-ea509cc7dbd1\") " Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.129606 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34708cb-eeda-449c-83ff-ea509cc7dbd1-kube-api-access-fxvfl" (OuterVolumeSpecName: "kube-api-access-fxvfl") pod "d34708cb-eeda-449c-83ff-ea509cc7dbd1" (UID: "d34708cb-eeda-449c-83ff-ea509cc7dbd1"). InnerVolumeSpecName "kube-api-access-fxvfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.149624 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-config-data" (OuterVolumeSpecName: "config-data") pod "d34708cb-eeda-449c-83ff-ea509cc7dbd1" (UID: "d34708cb-eeda-449c-83ff-ea509cc7dbd1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.151246 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d34708cb-eeda-449c-83ff-ea509cc7dbd1" (UID: "d34708cb-eeda-449c-83ff-ea509cc7dbd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.229528 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxvfl\" (UniqueName: \"kubernetes.io/projected/d34708cb-eeda-449c-83ff-ea509cc7dbd1-kube-api-access-fxvfl\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.229568 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.229579 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34708cb-eeda-449c-83ff-ea509cc7dbd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.300618 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.308642 4717 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.325808 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:35 crc kubenswrapper[4717]: E0217 15:14:35.326255 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b2abfd-5466-4810-b0ba-dfd4a956549b" containerName="nova-manage" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.326271 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b2abfd-5466-4810-b0ba-dfd4a956549b" containerName="nova-manage" Feb 17 15:14:35 crc kubenswrapper[4717]: E0217 15:14:35.326291 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34708cb-eeda-449c-83ff-ea509cc7dbd1" containerName="nova-scheduler-scheduler" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.326298 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34708cb-eeda-449c-83ff-ea509cc7dbd1" containerName="nova-scheduler-scheduler" Feb 17 15:14:35 crc kubenswrapper[4717]: E0217 15:14:35.326312 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ad65d7-0a8a-4022-885f-792da81b62df" containerName="nova-api-log" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.326320 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ad65d7-0a8a-4022-885f-792da81b62df" containerName="nova-api-log" Feb 17 15:14:35 crc kubenswrapper[4717]: E0217 15:14:35.326330 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ad65d7-0a8a-4022-885f-792da81b62df" containerName="nova-api-api" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.326335 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ad65d7-0a8a-4022-885f-792da81b62df" containerName="nova-api-api" Feb 17 15:14:35 crc kubenswrapper[4717]: E0217 15:14:35.326352 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" containerName="init" Feb 17 15:14:35 crc 
kubenswrapper[4717]: I0217 15:14:35.326358 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" containerName="init" Feb 17 15:14:35 crc kubenswrapper[4717]: E0217 15:14:35.326369 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" containerName="dnsmasq-dns" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.326374 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" containerName="dnsmasq-dns" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.326559 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6d77f0-490e-4a0e-bd37-1b061ff0b8e5" containerName="dnsmasq-dns" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.326573 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34708cb-eeda-449c-83ff-ea509cc7dbd1" containerName="nova-scheduler-scheduler" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.326586 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ad65d7-0a8a-4022-885f-792da81b62df" containerName="nova-api-api" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.326596 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ad65d7-0a8a-4022-885f-792da81b62df" containerName="nova-api-log" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.326609 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b2abfd-5466-4810-b0ba-dfd4a956549b" containerName="nova-manage" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.327586 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.332457 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.332607 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.332842 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.340974 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.436008 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.436051 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4j7\" (UniqueName: \"kubernetes.io/projected/a9a8f201-b55e-47e3-9d85-18d73631f9ae-kube-api-access-vt4j7\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.436100 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9a8f201-b55e-47e3-9d85-18d73631f9ae-logs\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.436123 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.436230 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-config-data\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.436261 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.537681 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-config-data\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.537743 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.537836 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 
15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.537869 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4j7\" (UniqueName: \"kubernetes.io/projected/a9a8f201-b55e-47e3-9d85-18d73631f9ae-kube-api-access-vt4j7\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.537895 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9a8f201-b55e-47e3-9d85-18d73631f9ae-logs\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.537913 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.540862 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9a8f201-b55e-47e3-9d85-18d73631f9ae-logs\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.541915 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.544457 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-combined-ca-bundle\") pod \"nova-api-0\" 
(UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.547308 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.551016 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a8f201-b55e-47e3-9d85-18d73631f9ae-config-data\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.568143 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt4j7\" (UniqueName: \"kubernetes.io/projected/a9a8f201-b55e-47e3-9d85-18d73631f9ae-kube-api-access-vt4j7\") pod \"nova-api-0\" (UID: \"a9a8f201-b55e-47e3-9d85-18d73631f9ae\") " pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.646419 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.856886 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ad65d7-0a8a-4022-885f-792da81b62df" path="/var/lib/kubelet/pods/c6ad65d7-0a8a-4022-885f-792da81b62df/volumes" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.947152 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d34708cb-eeda-449c-83ff-ea509cc7dbd1","Type":"ContainerDied","Data":"d364e92cd1f65dd39a1088b54af8e18a6dcd8fdbd2fc59f1c2a0e4516733620d"} Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.947205 4717 scope.go:117] "RemoveContainer" containerID="c2f167737850bb7a83afbea1f3313c44efcb7cef6d13a8e037b33ab648a7659b" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.947327 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.972139 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:14:35 crc kubenswrapper[4717]: I0217 15:14:35.989663 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.006426 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.007852 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.009904 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.016533 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:14:36 crc kubenswrapper[4717]: W0217 15:14:36.081563 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9a8f201_b55e_47e3_9d85_18d73631f9ae.slice/crio-8f36639c1f4d43ce02e9e5356d831671aeb896d0a5fc0eb608b67a44cdd64a9c WatchSource:0}: Error finding container 8f36639c1f4d43ce02e9e5356d831671aeb896d0a5fc0eb608b67a44cdd64a9c: Status 404 returned error can't find the container with id 8f36639c1f4d43ce02e9e5356d831671aeb896d0a5fc0eb608b67a44cdd64a9c Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.082893 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.148385 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a119d602-10c8-4b7b-aa61-77774c7f024f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a119d602-10c8-4b7b-aa61-77774c7f024f\") " pod="openstack/nova-scheduler-0" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.148750 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9flrr\" (UniqueName: \"kubernetes.io/projected/a119d602-10c8-4b7b-aa61-77774c7f024f-kube-api-access-9flrr\") pod \"nova-scheduler-0\" (UID: \"a119d602-10c8-4b7b-aa61-77774c7f024f\") " pod="openstack/nova-scheduler-0" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.148796 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a119d602-10c8-4b7b-aa61-77774c7f024f-config-data\") pod \"nova-scheduler-0\" (UID: \"a119d602-10c8-4b7b-aa61-77774c7f024f\") " pod="openstack/nova-scheduler-0" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.250247 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a119d602-10c8-4b7b-aa61-77774c7f024f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a119d602-10c8-4b7b-aa61-77774c7f024f\") " pod="openstack/nova-scheduler-0" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.250342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9flrr\" (UniqueName: \"kubernetes.io/projected/a119d602-10c8-4b7b-aa61-77774c7f024f-kube-api-access-9flrr\") pod \"nova-scheduler-0\" (UID: \"a119d602-10c8-4b7b-aa61-77774c7f024f\") " pod="openstack/nova-scheduler-0" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.250392 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a119d602-10c8-4b7b-aa61-77774c7f024f-config-data\") pod \"nova-scheduler-0\" (UID: \"a119d602-10c8-4b7b-aa61-77774c7f024f\") " pod="openstack/nova-scheduler-0" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.267170 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a119d602-10c8-4b7b-aa61-77774c7f024f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a119d602-10c8-4b7b-aa61-77774c7f024f\") " pod="openstack/nova-scheduler-0" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.270858 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a119d602-10c8-4b7b-aa61-77774c7f024f-config-data\") pod \"nova-scheduler-0\" (UID: 
\"a119d602-10c8-4b7b-aa61-77774c7f024f\") " pod="openstack/nova-scheduler-0" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.285652 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9flrr\" (UniqueName: \"kubernetes.io/projected/a119d602-10c8-4b7b-aa61-77774c7f024f-kube-api-access-9flrr\") pod \"nova-scheduler-0\" (UID: \"a119d602-10c8-4b7b-aa61-77774c7f024f\") " pod="openstack/nova-scheduler-0" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.332577 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.818442 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.958480 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a119d602-10c8-4b7b-aa61-77774c7f024f","Type":"ContainerStarted","Data":"497f229200d6dfb36e505e8b0373ce839dadf33937cc3436f16c9cb9c53ea0b9"} Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.959932 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9a8f201-b55e-47e3-9d85-18d73631f9ae","Type":"ContainerStarted","Data":"b752adf8834d24dba7b6db7cc9b1625ba313322f31c11c0316f0a9b19fcc88b0"} Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.959985 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9a8f201-b55e-47e3-9d85-18d73631f9ae","Type":"ContainerStarted","Data":"e01c63d13c5f90edd25d69f913e4ce70ed53a3df54bf05c5cf3b4324ef00877a"} Feb 17 15:14:36 crc kubenswrapper[4717]: I0217 15:14:36.959997 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9a8f201-b55e-47e3-9d85-18d73631f9ae","Type":"ContainerStarted","Data":"8f36639c1f4d43ce02e9e5356d831671aeb896d0a5fc0eb608b67a44cdd64a9c"} Feb 17 15:14:36 crc 
kubenswrapper[4717]: I0217 15:14:36.993905 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.993885105 podStartE2EDuration="1.993885105s" podCreationTimestamp="2026-02-17 15:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:14:36.977185172 +0000 UTC m=+1343.393025658" watchObservedRunningTime="2026-02-17 15:14:36.993885105 +0000 UTC m=+1343.409725571" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.270737 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:39308->10.217.0.197:8775: read: connection reset by peer" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.270802 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:39296->10.217.0.197:8775: read: connection reset by peer" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.748743 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.865887 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34708cb-eeda-449c-83ff-ea509cc7dbd1" path="/var/lib/kubelet/pods/d34708cb-eeda-449c-83ff-ea509cc7dbd1/volumes" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.881628 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zdqz\" (UniqueName: \"kubernetes.io/projected/ea023995-3d37-4a3b-b22b-9903c7e21fc6-kube-api-access-7zdqz\") pod \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.881837 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-nova-metadata-tls-certs\") pod \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.881913 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-config-data\") pod \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.881984 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea023995-3d37-4a3b-b22b-9903c7e21fc6-logs\") pod \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.882031 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-combined-ca-bundle\") pod 
\"ea023995-3d37-4a3b-b22b-9903c7e21fc6\" (UID: \"ea023995-3d37-4a3b-b22b-9903c7e21fc6\") " Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.883420 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea023995-3d37-4a3b-b22b-9903c7e21fc6-logs" (OuterVolumeSpecName: "logs") pod "ea023995-3d37-4a3b-b22b-9903c7e21fc6" (UID: "ea023995-3d37-4a3b-b22b-9903c7e21fc6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.890033 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea023995-3d37-4a3b-b22b-9903c7e21fc6-kube-api-access-7zdqz" (OuterVolumeSpecName: "kube-api-access-7zdqz") pod "ea023995-3d37-4a3b-b22b-9903c7e21fc6" (UID: "ea023995-3d37-4a3b-b22b-9903c7e21fc6"). InnerVolumeSpecName "kube-api-access-7zdqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.921289 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea023995-3d37-4a3b-b22b-9903c7e21fc6" (UID: "ea023995-3d37-4a3b-b22b-9903c7e21fc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.921378 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-config-data" (OuterVolumeSpecName: "config-data") pod "ea023995-3d37-4a3b-b22b-9903c7e21fc6" (UID: "ea023995-3d37-4a3b-b22b-9903c7e21fc6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.955576 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ea023995-3d37-4a3b-b22b-9903c7e21fc6" (UID: "ea023995-3d37-4a3b-b22b-9903c7e21fc6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.973582 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a119d602-10c8-4b7b-aa61-77774c7f024f","Type":"ContainerStarted","Data":"3458a531859166d6ef2e0db000349ba564d829bcbde5b5b05a4ec5c47e3393a3"} Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.977354 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerID="6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06" exitCode=0 Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.977506 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.978110 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea023995-3d37-4a3b-b22b-9903c7e21fc6","Type":"ContainerDied","Data":"6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06"} Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.978199 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea023995-3d37-4a3b-b22b-9903c7e21fc6","Type":"ContainerDied","Data":"abaf49433953181dccbe67ad76a14c6506457276c934616840c6133a5ed8dca3"} Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.978219 4717 scope.go:117] "RemoveContainer" containerID="6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.983948 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.983971 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.983981 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea023995-3d37-4a3b-b22b-9903c7e21fc6-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.983992 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea023995-3d37-4a3b-b22b-9903c7e21fc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.984003 4717 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-7zdqz\" (UniqueName: \"kubernetes.io/projected/ea023995-3d37-4a3b-b22b-9903c7e21fc6-kube-api-access-7zdqz\") on node \"crc\" DevicePath \"\"" Feb 17 15:14:37 crc kubenswrapper[4717]: I0217 15:14:37.993283 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9932615030000003 podStartE2EDuration="2.993261503s" podCreationTimestamp="2026-02-17 15:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:14:37.986433489 +0000 UTC m=+1344.402273965" watchObservedRunningTime="2026-02-17 15:14:37.993261503 +0000 UTC m=+1344.409101969" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.018224 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.020033 4717 scope.go:117] "RemoveContainer" containerID="aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.042989 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.054466 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 15:14:38 crc kubenswrapper[4717]: E0217 15:14:38.054951 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-metadata" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.054972 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-metadata" Feb 17 15:14:38 crc kubenswrapper[4717]: E0217 15:14:38.054987 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-log" Feb 17 
15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.054998 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-log" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.055242 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-log" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.055272 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" containerName="nova-metadata-metadata" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.056495 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.059330 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.059341 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.072445 4717 scope.go:117] "RemoveContainer" containerID="6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06" Feb 17 15:14:38 crc kubenswrapper[4717]: E0217 15:14:38.085433 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06\": container with ID starting with 6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06 not found: ID does not exist" containerID="6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.085483 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06"} 
err="failed to get container status \"6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06\": rpc error: code = NotFound desc = could not find container \"6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06\": container with ID starting with 6e3100df1935397f79d792d02fc77225c92bd3090f6ca3562379a06e540a1b06 not found: ID does not exist" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.085514 4717 scope.go:117] "RemoveContainer" containerID="aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384" Feb 17 15:14:38 crc kubenswrapper[4717]: E0217 15:14:38.086478 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384\": container with ID starting with aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384 not found: ID does not exist" containerID="aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.086601 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384"} err="failed to get container status \"aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384\": rpc error: code = NotFound desc = could not find container \"aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384\": container with ID starting with aca6ee6a8f9ef4729e3ba587c12af00836ece3badfa66725f9967ad7c56db384 not found: ID does not exist" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.096645 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.189492 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f3c5ad1f-898b-4643-80a5-6946068bf842-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.189592 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c5ad1f-898b-4643-80a5-6946068bf842-logs\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.189737 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c5ad1f-898b-4643-80a5-6946068bf842-config-data\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.189774 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxw4h\" (UniqueName: \"kubernetes.io/projected/f3c5ad1f-898b-4643-80a5-6946068bf842-kube-api-access-cxw4h\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.189839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c5ad1f-898b-4643-80a5-6946068bf842-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.291584 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c5ad1f-898b-4643-80a5-6946068bf842-config-data\") pod \"nova-metadata-0\" (UID: 
\"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.291643 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxw4h\" (UniqueName: \"kubernetes.io/projected/f3c5ad1f-898b-4643-80a5-6946068bf842-kube-api-access-cxw4h\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.291713 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c5ad1f-898b-4643-80a5-6946068bf842-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.291800 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c5ad1f-898b-4643-80a5-6946068bf842-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.291877 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c5ad1f-898b-4643-80a5-6946068bf842-logs\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.292517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c5ad1f-898b-4643-80a5-6946068bf842-logs\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.295878 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c5ad1f-898b-4643-80a5-6946068bf842-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.295931 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c5ad1f-898b-4643-80a5-6946068bf842-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.298143 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c5ad1f-898b-4643-80a5-6946068bf842-config-data\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.308273 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxw4h\" (UniqueName: \"kubernetes.io/projected/f3c5ad1f-898b-4643-80a5-6946068bf842-kube-api-access-cxw4h\") pod \"nova-metadata-0\" (UID: \"f3c5ad1f-898b-4643-80a5-6946068bf842\") " pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.387638 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.888462 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 15:14:38 crc kubenswrapper[4717]: W0217 15:14:38.888977 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3c5ad1f_898b_4643_80a5_6946068bf842.slice/crio-7a20388d2b6fd1e6b85171a0508341dbf0cd8bb0e673e3fe18c7e95fda0e6e5b WatchSource:0}: Error finding container 7a20388d2b6fd1e6b85171a0508341dbf0cd8bb0e673e3fe18c7e95fda0e6e5b: Status 404 returned error can't find the container with id 7a20388d2b6fd1e6b85171a0508341dbf0cd8bb0e673e3fe18c7e95fda0e6e5b Feb 17 15:14:38 crc kubenswrapper[4717]: I0217 15:14:38.986567 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3c5ad1f-898b-4643-80a5-6946068bf842","Type":"ContainerStarted","Data":"7a20388d2b6fd1e6b85171a0508341dbf0cd8bb0e673e3fe18c7e95fda0e6e5b"} Feb 17 15:14:39 crc kubenswrapper[4717]: I0217 15:14:39.864711 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea023995-3d37-4a3b-b22b-9903c7e21fc6" path="/var/lib/kubelet/pods/ea023995-3d37-4a3b-b22b-9903c7e21fc6/volumes" Feb 17 15:14:40 crc kubenswrapper[4717]: I0217 15:14:40.003671 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3c5ad1f-898b-4643-80a5-6946068bf842","Type":"ContainerStarted","Data":"e4a162310a83073d5330618ec6e60913cad9a5cd0522c2e447c832d05b896c71"} Feb 17 15:14:40 crc kubenswrapper[4717]: I0217 15:14:40.003741 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3c5ad1f-898b-4643-80a5-6946068bf842","Type":"ContainerStarted","Data":"fe5e9b86d76e3810a9dd4032a98facd1574ad5141aabcb56acc453bed7a38eb5"} Feb 17 15:14:40 crc kubenswrapper[4717]: I0217 15:14:40.025803 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.025784599 podStartE2EDuration="2.025784599s" podCreationTimestamp="2026-02-17 15:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:14:40.020737175 +0000 UTC m=+1346.436577661" watchObservedRunningTime="2026-02-17 15:14:40.025784599 +0000 UTC m=+1346.441625075" Feb 17 15:14:41 crc kubenswrapper[4717]: I0217 15:14:41.333237 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 15:14:43 crc kubenswrapper[4717]: I0217 15:14:43.388450 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 15:14:43 crc kubenswrapper[4717]: I0217 15:14:43.389160 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 15:14:45 crc kubenswrapper[4717]: I0217 15:14:45.663606 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 15:14:45 crc kubenswrapper[4717]: I0217 15:14:45.663930 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 15:14:46 crc kubenswrapper[4717]: I0217 15:14:46.333553 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 15:14:46 crc kubenswrapper[4717]: I0217 15:14:46.362104 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 15:14:46 crc kubenswrapper[4717]: I0217 15:14:46.681230 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a9a8f201-b55e-47e3-9d85-18d73631f9ae" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Feb 17 15:14:46 crc kubenswrapper[4717]: I0217 15:14:46.681237 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a9a8f201-b55e-47e3-9d85-18d73631f9ae" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 15:14:47 crc kubenswrapper[4717]: I0217 15:14:47.115440 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 15:14:48 crc kubenswrapper[4717]: I0217 15:14:48.388064 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 15:14:48 crc kubenswrapper[4717]: I0217 15:14:48.388158 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 15:14:49 crc kubenswrapper[4717]: I0217 15:14:49.403326 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f3c5ad1f-898b-4643-80a5-6946068bf842" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 15:14:49 crc kubenswrapper[4717]: I0217 15:14:49.403896 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f3c5ad1f-898b-4643-80a5-6946068bf842" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 15:14:54 crc kubenswrapper[4717]: I0217 15:14:54.219386 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 15:14:55 crc kubenswrapper[4717]: I0217 15:14:55.653544 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 15:14:55 crc 
kubenswrapper[4717]: I0217 15:14:55.653944 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 15:14:55 crc kubenswrapper[4717]: I0217 15:14:55.654212 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 15:14:55 crc kubenswrapper[4717]: I0217 15:14:55.654266 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 15:14:55 crc kubenswrapper[4717]: I0217 15:14:55.660007 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 15:14:55 crc kubenswrapper[4717]: I0217 15:14:55.665352 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 15:14:58 crc kubenswrapper[4717]: I0217 15:14:58.393797 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 15:14:58 crc kubenswrapper[4717]: I0217 15:14:58.397000 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 15:14:58 crc kubenswrapper[4717]: I0217 15:14:58.398956 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 15:14:59 crc kubenswrapper[4717]: I0217 15:14:59.196874 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.153852 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d"] Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.155309 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.157481 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.157711 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.171041 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d"] Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.354890 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-secret-volume\") pod \"collect-profiles-29522355-hcz2d\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.354988 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-config-volume\") pod \"collect-profiles-29522355-hcz2d\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.355010 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwd7g\" (UniqueName: \"kubernetes.io/projected/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-kube-api-access-vwd7g\") pod \"collect-profiles-29522355-hcz2d\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.456634 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-secret-volume\") pod \"collect-profiles-29522355-hcz2d\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.456760 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-config-volume\") pod \"collect-profiles-29522355-hcz2d\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.456800 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwd7g\" (UniqueName: \"kubernetes.io/projected/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-kube-api-access-vwd7g\") pod \"collect-profiles-29522355-hcz2d\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.457746 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-config-volume\") pod \"collect-profiles-29522355-hcz2d\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.463807 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-secret-volume\") pod \"collect-profiles-29522355-hcz2d\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.477105 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwd7g\" (UniqueName: \"kubernetes.io/projected/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-kube-api-access-vwd7g\") pod \"collect-profiles-29522355-hcz2d\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.492892 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:00 crc kubenswrapper[4717]: I0217 15:15:00.947839 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d"] Feb 17 15:15:00 crc kubenswrapper[4717]: W0217 15:15:00.950386 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f2a2c42_47b5_4edf_96e0_b8ba5dec3ad5.slice/crio-b90d8e6b990b18bac4f4280bd4726823515db8bee0bd12a3854ea059278a763b WatchSource:0}: Error finding container b90d8e6b990b18bac4f4280bd4726823515db8bee0bd12a3854ea059278a763b: Status 404 returned error can't find the container with id b90d8e6b990b18bac4f4280bd4726823515db8bee0bd12a3854ea059278a763b Feb 17 15:15:01 crc kubenswrapper[4717]: I0217 15:15:01.212114 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" event={"ID":"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5","Type":"ContainerStarted","Data":"4d78184bce8428bd71c74d9d6a6e9fa0d18a756b619038b235ecc71ad48eb1a8"} Feb 17 15:15:01 crc 
kubenswrapper[4717]: I0217 15:15:01.212159 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" event={"ID":"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5","Type":"ContainerStarted","Data":"b90d8e6b990b18bac4f4280bd4726823515db8bee0bd12a3854ea059278a763b"} Feb 17 15:15:01 crc kubenswrapper[4717]: I0217 15:15:01.230695 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" podStartSLOduration=1.230670626 podStartE2EDuration="1.230670626s" podCreationTimestamp="2026-02-17 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:15:01.226193769 +0000 UTC m=+1367.642034255" watchObservedRunningTime="2026-02-17 15:15:01.230670626 +0000 UTC m=+1367.646511102" Feb 17 15:15:02 crc kubenswrapper[4717]: I0217 15:15:02.223539 4717 generic.go:334] "Generic (PLEG): container finished" podID="6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5" containerID="4d78184bce8428bd71c74d9d6a6e9fa0d18a756b619038b235ecc71ad48eb1a8" exitCode=0 Feb 17 15:15:02 crc kubenswrapper[4717]: I0217 15:15:02.223658 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" event={"ID":"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5","Type":"ContainerDied","Data":"4d78184bce8428bd71c74d9d6a6e9fa0d18a756b619038b235ecc71ad48eb1a8"} Feb 17 15:15:03 crc kubenswrapper[4717]: I0217 15:15:03.605457 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:03 crc kubenswrapper[4717]: I0217 15:15:03.719412 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-secret-volume\") pod \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " Feb 17 15:15:03 crc kubenswrapper[4717]: I0217 15:15:03.719493 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwd7g\" (UniqueName: \"kubernetes.io/projected/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-kube-api-access-vwd7g\") pod \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " Feb 17 15:15:03 crc kubenswrapper[4717]: I0217 15:15:03.719535 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-config-volume\") pod \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\" (UID: \"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5\") " Feb 17 15:15:03 crc kubenswrapper[4717]: I0217 15:15:03.720586 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-config-volume" (OuterVolumeSpecName: "config-volume") pod "6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5" (UID: "6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:03 crc kubenswrapper[4717]: I0217 15:15:03.725428 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5" (UID: "6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:15:03 crc kubenswrapper[4717]: I0217 15:15:03.725681 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-kube-api-access-vwd7g" (OuterVolumeSpecName: "kube-api-access-vwd7g") pod "6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5" (UID: "6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5"). InnerVolumeSpecName "kube-api-access-vwd7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:15:03 crc kubenswrapper[4717]: I0217 15:15:03.821987 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:03 crc kubenswrapper[4717]: I0217 15:15:03.822025 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwd7g\" (UniqueName: \"kubernetes.io/projected/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-kube-api-access-vwd7g\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:03 crc kubenswrapper[4717]: I0217 15:15:03.822035 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:04 crc kubenswrapper[4717]: I0217 15:15:04.254358 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" event={"ID":"6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5","Type":"ContainerDied","Data":"b90d8e6b990b18bac4f4280bd4726823515db8bee0bd12a3854ea059278a763b"} Feb 17 15:15:04 crc kubenswrapper[4717]: I0217 15:15:04.254397 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b90d8e6b990b18bac4f4280bd4726823515db8bee0bd12a3854ea059278a763b" Feb 17 15:15:04 crc kubenswrapper[4717]: I0217 15:15:04.254405 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d" Feb 17 15:15:06 crc kubenswrapper[4717]: I0217 15:15:06.714478 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 15:15:08 crc kubenswrapper[4717]: I0217 15:15:08.490278 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 15:15:10 crc kubenswrapper[4717]: I0217 15:15:10.985003 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="8924cebf-3c79-4978-9564-ec8869b9d79a" containerName="rabbitmq" containerID="cri-o://2c1aa0b270a00ffe95fae98442ccacfdc668667202ea12c65f80fb8f3530f8f9" gracePeriod=604796 Feb 17 15:15:12 crc kubenswrapper[4717]: I0217 15:15:12.661956 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0eb38f44-bed1-4e65-8de2-9624715baee1" containerName="rabbitmq" containerID="cri-o://2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612" gracePeriod=604796 Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.386700 4717 generic.go:334] "Generic (PLEG): container finished" podID="8924cebf-3c79-4978-9564-ec8869b9d79a" containerID="2c1aa0b270a00ffe95fae98442ccacfdc668667202ea12c65f80fb8f3530f8f9" exitCode=0 Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.386782 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8924cebf-3c79-4978-9564-ec8869b9d79a","Type":"ContainerDied","Data":"2c1aa0b270a00ffe95fae98442ccacfdc668667202ea12c65f80fb8f3530f8f9"} Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.610915 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.708747 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-confd\") pod \"8924cebf-3c79-4978-9564-ec8869b9d79a\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.708919 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-plugins-conf\") pod \"8924cebf-3c79-4978-9564-ec8869b9d79a\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.708955 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-config-data\") pod \"8924cebf-3c79-4978-9564-ec8869b9d79a\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.708996 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-server-conf\") pod \"8924cebf-3c79-4978-9564-ec8869b9d79a\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.709146 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-erlang-cookie\") pod \"8924cebf-3c79-4978-9564-ec8869b9d79a\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.709200 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-tls\") pod \"8924cebf-3c79-4978-9564-ec8869b9d79a\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.709233 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-plugins\") pod \"8924cebf-3c79-4978-9564-ec8869b9d79a\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.709324 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8924cebf-3c79-4978-9564-ec8869b9d79a-pod-info\") pod \"8924cebf-3c79-4978-9564-ec8869b9d79a\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.709361 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqxbk\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-kube-api-access-mqxbk\") pod \"8924cebf-3c79-4978-9564-ec8869b9d79a\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.709381 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"8924cebf-3c79-4978-9564-ec8869b9d79a\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.709421 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8924cebf-3c79-4978-9564-ec8869b9d79a-erlang-cookie-secret\") pod \"8924cebf-3c79-4978-9564-ec8869b9d79a\" (UID: \"8924cebf-3c79-4978-9564-ec8869b9d79a\") " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 
15:15:17.710173 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8924cebf-3c79-4978-9564-ec8869b9d79a" (UID: "8924cebf-3c79-4978-9564-ec8869b9d79a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.711064 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8924cebf-3c79-4978-9564-ec8869b9d79a" (UID: "8924cebf-3c79-4978-9564-ec8869b9d79a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.711702 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8924cebf-3c79-4978-9564-ec8869b9d79a" (UID: "8924cebf-3c79-4978-9564-ec8869b9d79a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.719692 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8924cebf-3c79-4978-9564-ec8869b9d79a" (UID: "8924cebf-3c79-4978-9564-ec8869b9d79a"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.721147 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8924cebf-3c79-4978-9564-ec8869b9d79a-pod-info" (OuterVolumeSpecName: "pod-info") pod "8924cebf-3c79-4978-9564-ec8869b9d79a" (UID: "8924cebf-3c79-4978-9564-ec8869b9d79a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.727814 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "8924cebf-3c79-4978-9564-ec8869b9d79a" (UID: "8924cebf-3c79-4978-9564-ec8869b9d79a"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.731382 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-kube-api-access-mqxbk" (OuterVolumeSpecName: "kube-api-access-mqxbk") pod "8924cebf-3c79-4978-9564-ec8869b9d79a" (UID: "8924cebf-3c79-4978-9564-ec8869b9d79a"). InnerVolumeSpecName "kube-api-access-mqxbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.742237 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8924cebf-3c79-4978-9564-ec8869b9d79a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8924cebf-3c79-4978-9564-ec8869b9d79a" (UID: "8924cebf-3c79-4978-9564-ec8869b9d79a"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.811870 4717 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8924cebf-3c79-4978-9564-ec8869b9d79a-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.811904 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqxbk\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-kube-api-access-mqxbk\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.811931 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.811940 4717 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8924cebf-3c79-4978-9564-ec8869b9d79a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.811949 4717 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.811958 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.811967 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.811975 
4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.818666 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-config-data" (OuterVolumeSpecName: "config-data") pod "8924cebf-3c79-4978-9564-ec8869b9d79a" (UID: "8924cebf-3c79-4978-9564-ec8869b9d79a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.848447 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-server-conf" (OuterVolumeSpecName: "server-conf") pod "8924cebf-3c79-4978-9564-ec8869b9d79a" (UID: "8924cebf-3c79-4978-9564-ec8869b9d79a"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.852250 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.868311 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8924cebf-3c79-4978-9564-ec8869b9d79a" (UID: "8924cebf-3c79-4978-9564-ec8869b9d79a"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.914587 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8924cebf-3c79-4978-9564-ec8869b9d79a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.914635 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.914649 4717 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8924cebf-3c79-4978-9564-ec8869b9d79a-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:17 crc kubenswrapper[4717]: I0217 15:15:17.914664 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.400209 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8924cebf-3c79-4978-9564-ec8869b9d79a","Type":"ContainerDied","Data":"2ce4d391f32e4899023fcfbc3cf6e3c054c02f8462cdda8e71fa0e7a1c96ad89"} Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.400269 4717 scope.go:117] "RemoveContainer" containerID="2c1aa0b270a00ffe95fae98442ccacfdc668667202ea12c65f80fb8f3530f8f9" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.400273 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.436269 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.436677 4717 scope.go:117] "RemoveContainer" containerID="9d4afd969ca60da733acf3772d99a5fb8c4614efb5bf99795644d5b1b294843f" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.460299 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.485626 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 15:15:18 crc kubenswrapper[4717]: E0217 15:15:18.486473 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5" containerName="collect-profiles" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.486572 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5" containerName="collect-profiles" Feb 17 15:15:18 crc kubenswrapper[4717]: E0217 15:15:18.486704 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8924cebf-3c79-4978-9564-ec8869b9d79a" containerName="rabbitmq" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.486790 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8924cebf-3c79-4978-9564-ec8869b9d79a" containerName="rabbitmq" Feb 17 15:15:18 crc kubenswrapper[4717]: E0217 15:15:18.486878 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8924cebf-3c79-4978-9564-ec8869b9d79a" containerName="setup-container" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.486955 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8924cebf-3c79-4978-9564-ec8869b9d79a" containerName="setup-container" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.487274 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8924cebf-3c79-4978-9564-ec8869b9d79a" containerName="rabbitmq" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.487361 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5" containerName="collect-profiles" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.488736 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.492188 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.493231 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.493415 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.493555 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8gt4v" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.493679 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.494795 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.498818 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.520331 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.641536 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.641613 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.641638 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.641687 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.641728 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.641753 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.641793 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.641838 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.641866 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.641884 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw7pt\" (UniqueName: \"kubernetes.io/projected/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-kube-api-access-rw7pt\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.641919 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.743765 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-config-data\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744045 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744261 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744352 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 
15:15:18.744485 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744575 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744664 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744704 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744737 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744755 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw7pt\" (UniqueName: \"kubernetes.io/projected/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-kube-api-access-rw7pt\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744774 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.744842 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-config-data\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.745535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.745665 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc 
kubenswrapper[4717]: I0217 15:15:18.745687 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.755059 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.755927 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.760675 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.763833 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw7pt\" (UniqueName: \"kubernetes.io/projected/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-kube-api-access-rw7pt\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.766523 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.784453 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7\") " pod="openstack/rabbitmq-server-0" Feb 17 15:15:18 crc kubenswrapper[4717]: I0217 15:15:18.838781 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.353462 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: W0217 15:15:19.356455 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93a4dd31_2da7_4bf5_8ad0_6c17ec0fcba7.slice/crio-04ba4732d0de2e86f1f411953428cf3040bc5c38bb99f786bc2b73ea292a3508 WatchSource:0}: Error finding container 04ba4732d0de2e86f1f411953428cf3040bc5c38bb99f786bc2b73ea292a3508: Status 404 returned error can't find the container with id 04ba4732d0de2e86f1f411953428cf3040bc5c38bb99f786bc2b73ea292a3508 Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.359839 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.462878 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-plugins-conf\") pod \"0eb38f44-bed1-4e65-8de2-9624715baee1\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.463282 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-tls\") pod \"0eb38f44-bed1-4e65-8de2-9624715baee1\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.463312 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-erlang-cookie\") pod \"0eb38f44-bed1-4e65-8de2-9624715baee1\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.463426 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"0eb38f44-bed1-4e65-8de2-9624715baee1\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.463455 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0eb38f44-bed1-4e65-8de2-9624715baee1-erlang-cookie-secret\") pod \"0eb38f44-bed1-4e65-8de2-9624715baee1\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.463490 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-server-conf\") pod \"0eb38f44-bed1-4e65-8de2-9624715baee1\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.463519 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-config-data\") pod \"0eb38f44-bed1-4e65-8de2-9624715baee1\" (UID: 
\"0eb38f44-bed1-4e65-8de2-9624715baee1\") " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.463550 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8jr7\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-kube-api-access-m8jr7\") pod \"0eb38f44-bed1-4e65-8de2-9624715baee1\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.463583 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-confd\") pod \"0eb38f44-bed1-4e65-8de2-9624715baee1\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.463618 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-plugins\") pod \"0eb38f44-bed1-4e65-8de2-9624715baee1\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.463646 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0eb38f44-bed1-4e65-8de2-9624715baee1-pod-info\") pod \"0eb38f44-bed1-4e65-8de2-9624715baee1\" (UID: \"0eb38f44-bed1-4e65-8de2-9624715baee1\") " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.468284 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0eb38f44-bed1-4e65-8de2-9624715baee1" (UID: "0eb38f44-bed1-4e65-8de2-9624715baee1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.468414 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0eb38f44-bed1-4e65-8de2-9624715baee1" (UID: "0eb38f44-bed1-4e65-8de2-9624715baee1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.473517 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0eb38f44-bed1-4e65-8de2-9624715baee1" (UID: "0eb38f44-bed1-4e65-8de2-9624715baee1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.474598 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0eb38f44-bed1-4e65-8de2-9624715baee1-pod-info" (OuterVolumeSpecName: "pod-info") pod "0eb38f44-bed1-4e65-8de2-9624715baee1" (UID: "0eb38f44-bed1-4e65-8de2-9624715baee1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.480613 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0eb38f44-bed1-4e65-8de2-9624715baee1" (UID: "0eb38f44-bed1-4e65-8de2-9624715baee1"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.481718 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7","Type":"ContainerStarted","Data":"04ba4732d0de2e86f1f411953428cf3040bc5c38bb99f786bc2b73ea292a3508"} Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.482994 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-kube-api-access-m8jr7" (OuterVolumeSpecName: "kube-api-access-m8jr7") pod "0eb38f44-bed1-4e65-8de2-9624715baee1" (UID: "0eb38f44-bed1-4e65-8de2-9624715baee1"). InnerVolumeSpecName "kube-api-access-m8jr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.484772 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "0eb38f44-bed1-4e65-8de2-9624715baee1" (UID: "0eb38f44-bed1-4e65-8de2-9624715baee1"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.485581 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eb38f44-bed1-4e65-8de2-9624715baee1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0eb38f44-bed1-4e65-8de2-9624715baee1" (UID: "0eb38f44-bed1-4e65-8de2-9624715baee1"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.491033 4717 generic.go:334] "Generic (PLEG): container finished" podID="0eb38f44-bed1-4e65-8de2-9624715baee1" containerID="2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612" exitCode=0 Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.491351 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0eb38f44-bed1-4e65-8de2-9624715baee1","Type":"ContainerDied","Data":"2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612"} Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.491426 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0eb38f44-bed1-4e65-8de2-9624715baee1","Type":"ContainerDied","Data":"5a4ec674f10416720d1b003586ad4b91e335572a2782a4c2ced47e3bae754bca"} Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.491452 4717 scope.go:117] "RemoveContainer" containerID="2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.491797 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.534553 4717 scope.go:117] "RemoveContainer" containerID="9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.545459 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-config-data" (OuterVolumeSpecName: "config-data") pod "0eb38f44-bed1-4e65-8de2-9624715baee1" (UID: "0eb38f44-bed1-4e65-8de2-9624715baee1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.566783 4717 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.566811 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.566820 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.566848 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.566857 4717 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0eb38f44-bed1-4e65-8de2-9624715baee1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.566865 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.566874 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8jr7\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-kube-api-access-m8jr7\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 
15:15:19.566882 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.566890 4717 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0eb38f44-bed1-4e65-8de2-9624715baee1-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.573100 4717 scope.go:117] "RemoveContainer" containerID="2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612" Feb 17 15:15:19 crc kubenswrapper[4717]: E0217 15:15:19.573575 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612\": container with ID starting with 2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612 not found: ID does not exist" containerID="2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.573620 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612"} err="failed to get container status \"2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612\": rpc error: code = NotFound desc = could not find container \"2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612\": container with ID starting with 2dbe595c86d1c80a6b6637ed3ea11c6cfd79b0867e8fce7699fd32ed6db91612 not found: ID does not exist" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.573652 4717 scope.go:117] "RemoveContainer" containerID="9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b" Feb 17 15:15:19 crc kubenswrapper[4717]: E0217 15:15:19.575260 4717 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b\": container with ID starting with 9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b not found: ID does not exist" containerID="9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.575291 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b"} err="failed to get container status \"9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b\": rpc error: code = NotFound desc = could not find container \"9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b\": container with ID starting with 9742265cb034d254215ebd27d4a013d407abf002fdfd0518586975bbc57eed7b not found: ID does not exist" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.578505 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-server-conf" (OuterVolumeSpecName: "server-conf") pod "0eb38f44-bed1-4e65-8de2-9624715baee1" (UID: "0eb38f44-bed1-4e65-8de2-9624715baee1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.586844 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.624765 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0eb38f44-bed1-4e65-8de2-9624715baee1" (UID: "0eb38f44-bed1-4e65-8de2-9624715baee1"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.669061 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.669113 4717 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0eb38f44-bed1-4e65-8de2-9624715baee1-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.669128 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0eb38f44-bed1-4e65-8de2-9624715baee1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.827678 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.837855 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.864262 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb38f44-bed1-4e65-8de2-9624715baee1" path="/var/lib/kubelet/pods/0eb38f44-bed1-4e65-8de2-9624715baee1/volumes" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.865207 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8924cebf-3c79-4978-9564-ec8869b9d79a" path="/var/lib/kubelet/pods/8924cebf-3c79-4978-9564-ec8869b9d79a/volumes" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.865812 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 15:15:19 crc kubenswrapper[4717]: E0217 15:15:19.866178 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0eb38f44-bed1-4e65-8de2-9624715baee1" containerName="rabbitmq" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.866194 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb38f44-bed1-4e65-8de2-9624715baee1" containerName="rabbitmq" Feb 17 15:15:19 crc kubenswrapper[4717]: E0217 15:15:19.866219 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb38f44-bed1-4e65-8de2-9624715baee1" containerName="setup-container" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.866226 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb38f44-bed1-4e65-8de2-9624715baee1" containerName="setup-container" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.866422 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb38f44-bed1-4e65-8de2-9624715baee1" containerName="rabbitmq" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.867460 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.881302 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.881499 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.881520 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-kwnlh" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.881605 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.881719 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.881797 4717 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.881888 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.888038 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.985444 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.985513 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.985584 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.985607 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.985685 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mcvz\" (UniqueName: \"kubernetes.io/projected/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-kube-api-access-6mcvz\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.985760 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.985851 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.985880 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.985905 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.985930 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:19 crc kubenswrapper[4717]: I0217 15:15:19.985997 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.088478 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.088570 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.088613 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.088654 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.088679 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.088705 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.088740 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.088774 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.088813 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.088834 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.088884 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mcvz\" (UniqueName: \"kubernetes.io/projected/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-kube-api-access-6mcvz\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.089941 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.090652 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.090773 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-plugins\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.090936 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.090987 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.091057 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.093634 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.094004 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc 
kubenswrapper[4717]: I0217 15:15:20.094868 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.095345 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.108786 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mcvz\" (UniqueName: \"kubernetes.io/projected/dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b-kube-api-access-6mcvz\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.131332 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.195978 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.290620 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-cmdbm"] Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.292609 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.297447 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.314985 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-cmdbm"] Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.393597 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.393651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-config\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.393679 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-svc\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.393694 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " 
pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.393712 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxqwf\" (UniqueName: \"kubernetes.io/projected/8e9ddb1b-7782-4cbe-beac-25d175842b8e-kube-api-access-dxqwf\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.393803 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.393825 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.420641 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-cmdbm"] Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.442347 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-dncpb"] Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.444117 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.470347 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-dncpb"] Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.495074 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.495145 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-config\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.495176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-svc\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.495194 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.495213 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxqwf\" (UniqueName: 
\"kubernetes.io/projected/8e9ddb1b-7782-4cbe-beac-25d175842b8e-kube-api-access-dxqwf\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.495288 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.495304 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.496000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.496112 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.496561 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.497198 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.498367 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-config\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.501513 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-svc\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.521305 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxqwf\" (UniqueName: \"kubernetes.io/projected/8e9ddb1b-7782-4cbe-beac-25d175842b8e-kube-api-access-dxqwf\") pod \"dnsmasq-dns-67b789f86c-cmdbm\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.596962 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.597022 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.597127 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.597165 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgfrx\" (UniqueName: \"kubernetes.io/projected/394b7b44-48ed-406e-be48-6f7cffabfaf9-kube-api-access-mgfrx\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.597198 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.597250 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.597455 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-config\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.699110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.699232 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.699268 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgfrx\" (UniqueName: \"kubernetes.io/projected/394b7b44-48ed-406e-be48-6f7cffabfaf9-kube-api-access-mgfrx\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.699309 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.699361 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.699412 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-config\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.699504 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.700495 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.700495 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.701214 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.701378 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.701549 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-config\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.702409 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/394b7b44-48ed-406e-be48-6f7cffabfaf9-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.723797 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgfrx\" (UniqueName: \"kubernetes.io/projected/394b7b44-48ed-406e-be48-6f7cffabfaf9-kube-api-access-mgfrx\") pod 
\"dnsmasq-dns-cb6ffcf87-dncpb\" (UID: \"394b7b44-48ed-406e-be48-6f7cffabfaf9\") " pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:20 crc kubenswrapper[4717]: W0217 15:15:20.818134 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfbeb262_7d4a_4ec3_ac8b_0b46e0e58f2b.slice/crio-4a34a2fd55f50fdf5ae574c86e64dd6209e7585f434aa70162e321a510b3922c WatchSource:0}: Error finding container 4a34a2fd55f50fdf5ae574c86e64dd6209e7585f434aa70162e321a510b3922c: Status 404 returned error can't find the container with id 4a34a2fd55f50fdf5ae574c86e64dd6209e7585f434aa70162e321a510b3922c Feb 17 15:15:20 crc kubenswrapper[4717]: I0217 15:15:20.818953 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 15:15:21 crc kubenswrapper[4717]: I0217 15:15:21.068179 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:21 crc kubenswrapper[4717]: I0217 15:15:21.083604 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:21 crc kubenswrapper[4717]: I0217 15:15:21.534609 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b","Type":"ContainerStarted","Data":"4a34a2fd55f50fdf5ae574c86e64dd6209e7585f434aa70162e321a510b3922c"} Feb 17 15:15:21 crc kubenswrapper[4717]: I0217 15:15:21.537090 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7","Type":"ContainerStarted","Data":"bb6a4e263dc5b81a7a61668b497f7ce73802fdcd659e02836fe9888c1f6e0d8a"} Feb 17 15:15:21 crc kubenswrapper[4717]: I0217 15:15:21.560366 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-dncpb"] Feb 17 15:15:21 crc kubenswrapper[4717]: W0217 15:15:21.572115 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod394b7b44_48ed_406e_be48_6f7cffabfaf9.slice/crio-dc011de5347955ee1979dcc711acc7bb11d72cb5e753350f0e3513a339794743 WatchSource:0}: Error finding container dc011de5347955ee1979dcc711acc7bb11d72cb5e753350f0e3513a339794743: Status 404 returned error can't find the container with id dc011de5347955ee1979dcc711acc7bb11d72cb5e753350f0e3513a339794743 Feb 17 15:15:21 crc kubenswrapper[4717]: I0217 15:15:21.573729 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-cmdbm"] Feb 17 15:15:21 crc kubenswrapper[4717]: W0217 15:15:21.585821 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e9ddb1b_7782_4cbe_beac_25d175842b8e.slice/crio-54b3600abadd6a36c265a11159fe25befa4c02e90e152313cde8eaa38e4a1533 WatchSource:0}: Error finding container 54b3600abadd6a36c265a11159fe25befa4c02e90e152313cde8eaa38e4a1533: Status 404 returned error can't 
find the container with id 54b3600abadd6a36c265a11159fe25befa4c02e90e152313cde8eaa38e4a1533 Feb 17 15:15:22 crc kubenswrapper[4717]: I0217 15:15:22.552199 4717 generic.go:334] "Generic (PLEG): container finished" podID="394b7b44-48ed-406e-be48-6f7cffabfaf9" containerID="7abecfaaa212b0d343e2f47c33579715c0809cee42fc8620afd019b04991bc17" exitCode=0 Feb 17 15:15:22 crc kubenswrapper[4717]: I0217 15:15:22.552277 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" event={"ID":"394b7b44-48ed-406e-be48-6f7cffabfaf9","Type":"ContainerDied","Data":"7abecfaaa212b0d343e2f47c33579715c0809cee42fc8620afd019b04991bc17"} Feb 17 15:15:22 crc kubenswrapper[4717]: I0217 15:15:22.553336 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" event={"ID":"394b7b44-48ed-406e-be48-6f7cffabfaf9","Type":"ContainerStarted","Data":"dc011de5347955ee1979dcc711acc7bb11d72cb5e753350f0e3513a339794743"} Feb 17 15:15:22 crc kubenswrapper[4717]: I0217 15:15:22.556755 4717 generic.go:334] "Generic (PLEG): container finished" podID="8e9ddb1b-7782-4cbe-beac-25d175842b8e" containerID="032bc54212393d828704192ef0bcc3518ae9c8b937e4be487c5683126779cffb" exitCode=0 Feb 17 15:15:22 crc kubenswrapper[4717]: I0217 15:15:22.556820 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" event={"ID":"8e9ddb1b-7782-4cbe-beac-25d175842b8e","Type":"ContainerDied","Data":"032bc54212393d828704192ef0bcc3518ae9c8b937e4be487c5683126779cffb"} Feb 17 15:15:22 crc kubenswrapper[4717]: I0217 15:15:22.556885 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" event={"ID":"8e9ddb1b-7782-4cbe-beac-25d175842b8e","Type":"ContainerStarted","Data":"54b3600abadd6a36c265a11159fe25befa4c02e90e152313cde8eaa38e4a1533"} Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.005820 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.063862 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-swift-storage-0\") pod \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.063988 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-openstack-edpm-ipam\") pod \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.064051 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-svc\") pod \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.064210 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-sb\") pod \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.064252 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-nb\") pod \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.064314 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dxqwf\" (UniqueName: \"kubernetes.io/projected/8e9ddb1b-7782-4cbe-beac-25d175842b8e-kube-api-access-dxqwf\") pod \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.064345 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-config\") pod \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\" (UID: \"8e9ddb1b-7782-4cbe-beac-25d175842b8e\") " Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.070338 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9ddb1b-7782-4cbe-beac-25d175842b8e-kube-api-access-dxqwf" (OuterVolumeSpecName: "kube-api-access-dxqwf") pod "8e9ddb1b-7782-4cbe-beac-25d175842b8e" (UID: "8e9ddb1b-7782-4cbe-beac-25d175842b8e"). InnerVolumeSpecName "kube-api-access-dxqwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.096968 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-config" (OuterVolumeSpecName: "config") pod "8e9ddb1b-7782-4cbe-beac-25d175842b8e" (UID: "8e9ddb1b-7782-4cbe-beac-25d175842b8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.097253 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "8e9ddb1b-7782-4cbe-beac-25d175842b8e" (UID: "8e9ddb1b-7782-4cbe-beac-25d175842b8e"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.097270 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8e9ddb1b-7782-4cbe-beac-25d175842b8e" (UID: "8e9ddb1b-7782-4cbe-beac-25d175842b8e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.097430 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e9ddb1b-7782-4cbe-beac-25d175842b8e" (UID: "8e9ddb1b-7782-4cbe-beac-25d175842b8e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.101961 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e9ddb1b-7782-4cbe-beac-25d175842b8e" (UID: "8e9ddb1b-7782-4cbe-beac-25d175842b8e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.107362 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e9ddb1b-7782-4cbe-beac-25d175842b8e" (UID: "8e9ddb1b-7782-4cbe-beac-25d175842b8e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.166978 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.167041 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.167060 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxqwf\" (UniqueName: \"kubernetes.io/projected/8e9ddb1b-7782-4cbe-beac-25d175842b8e-kube-api-access-dxqwf\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.167104 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.167125 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.167143 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.167160 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e9ddb1b-7782-4cbe-beac-25d175842b8e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.573876 
4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" event={"ID":"394b7b44-48ed-406e-be48-6f7cffabfaf9","Type":"ContainerStarted","Data":"4717f4b7a555459d029e618f90321136d0fd0fa65ad3e97189aded0299cac249"} Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.574910 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.579602 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" event={"ID":"8e9ddb1b-7782-4cbe-beac-25d175842b8e","Type":"ContainerDied","Data":"54b3600abadd6a36c265a11159fe25befa4c02e90e152313cde8eaa38e4a1533"} Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.579648 4717 scope.go:117] "RemoveContainer" containerID="032bc54212393d828704192ef0bcc3518ae9c8b937e4be487c5683126779cffb" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.579765 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-cmdbm" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.590717 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b","Type":"ContainerStarted","Data":"62ac1333afe0b4f25c393a01c7c405282d35ed82702c742e67823e2c7ecad3f1"} Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.624144 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" podStartSLOduration=3.6241209420000002 podStartE2EDuration="3.624120942s" podCreationTimestamp="2026-02-17 15:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:15:23.623749691 +0000 UTC m=+1390.039590197" watchObservedRunningTime="2026-02-17 15:15:23.624120942 +0000 UTC m=+1390.039961458" Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.733251 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-cmdbm"] Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.739639 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-cmdbm"] Feb 17 15:15:23 crc kubenswrapper[4717]: I0217 15:15:23.859156 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e9ddb1b-7782-4cbe-beac-25d175842b8e" path="/var/lib/kubelet/pods/8e9ddb1b-7782-4cbe-beac-25d175842b8e/volumes" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.086633 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-dncpb" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.152056 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-g7ft8"] Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.152295 4717 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" podUID="6f5488fe-614a-4ff6-bb53-f1578e913bdd" containerName="dnsmasq-dns" containerID="cri-o://6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed" gracePeriod=10 Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.689915 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.711375 4717 generic.go:334] "Generic (PLEG): container finished" podID="6f5488fe-614a-4ff6-bb53-f1578e913bdd" containerID="6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed" exitCode=0 Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.711420 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" event={"ID":"6f5488fe-614a-4ff6-bb53-f1578e913bdd","Type":"ContainerDied","Data":"6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed"} Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.711437 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.711456 4717 scope.go:117] "RemoveContainer" containerID="6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.711446 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-g7ft8" event={"ID":"6f5488fe-614a-4ff6-bb53-f1578e913bdd","Type":"ContainerDied","Data":"deeba058bb9c72329c64137b8cb827cfdbef3393433a88ca0b3e29a624f81b23"} Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.744362 4717 scope.go:117] "RemoveContainer" containerID="9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.768667 4717 scope.go:117] "RemoveContainer" containerID="6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed" Feb 17 15:15:31 crc kubenswrapper[4717]: E0217 15:15:31.768984 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed\": container with ID starting with 6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed not found: ID does not exist" containerID="6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.769022 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed"} err="failed to get container status \"6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed\": rpc error: code = NotFound desc = could not find container \"6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed\": container with ID starting with 6e8525dc9d25f75f286dbee2506f2bc15d716ba09f0d7f73d9c338590dcd6bed not found: ID does not exist" Feb 17 15:15:31 crc 
kubenswrapper[4717]: I0217 15:15:31.769042 4717 scope.go:117] "RemoveContainer" containerID="9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819" Feb 17 15:15:31 crc kubenswrapper[4717]: E0217 15:15:31.769285 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819\": container with ID starting with 9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819 not found: ID does not exist" containerID="9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.769312 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819"} err="failed to get container status \"9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819\": rpc error: code = NotFound desc = could not find container \"9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819\": container with ID starting with 9fbc9a2ed0edfe50b3b9f4f79ef6cdfaf7dfa03de1cc7d9b28544060facd2819 not found: ID does not exist" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.790252 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-nb\") pod \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.790383 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwm2r\" (UniqueName: \"kubernetes.io/projected/6f5488fe-614a-4ff6-bb53-f1578e913bdd-kube-api-access-nwm2r\") pod \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 
15:15:31.790478 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-sb\") pod \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.790726 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-config\") pod \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.790763 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-svc\") pod \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.790829 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-swift-storage-0\") pod \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\" (UID: \"6f5488fe-614a-4ff6-bb53-f1578e913bdd\") " Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.804330 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f5488fe-614a-4ff6-bb53-f1578e913bdd-kube-api-access-nwm2r" (OuterVolumeSpecName: "kube-api-access-nwm2r") pod "6f5488fe-614a-4ff6-bb53-f1578e913bdd" (UID: "6f5488fe-614a-4ff6-bb53-f1578e913bdd"). InnerVolumeSpecName "kube-api-access-nwm2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.840765 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6f5488fe-614a-4ff6-bb53-f1578e913bdd" (UID: "6f5488fe-614a-4ff6-bb53-f1578e913bdd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.843758 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6f5488fe-614a-4ff6-bb53-f1578e913bdd" (UID: "6f5488fe-614a-4ff6-bb53-f1578e913bdd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.845009 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6f5488fe-614a-4ff6-bb53-f1578e913bdd" (UID: "6f5488fe-614a-4ff6-bb53-f1578e913bdd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.850448 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6f5488fe-614a-4ff6-bb53-f1578e913bdd" (UID: "6f5488fe-614a-4ff6-bb53-f1578e913bdd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.851905 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-config" (OuterVolumeSpecName: "config") pod "6f5488fe-614a-4ff6-bb53-f1578e913bdd" (UID: "6f5488fe-614a-4ff6-bb53-f1578e913bdd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.894134 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.894192 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.894214 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwm2r\" (UniqueName: \"kubernetes.io/projected/6f5488fe-614a-4ff6-bb53-f1578e913bdd-kube-api-access-nwm2r\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.894238 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.894257 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:31 crc kubenswrapper[4717]: I0217 15:15:31.894276 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6f5488fe-614a-4ff6-bb53-f1578e913bdd-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 15:15:32 crc kubenswrapper[4717]: I0217 15:15:32.072466 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-g7ft8"] Feb 17 15:15:32 crc kubenswrapper[4717]: I0217 15:15:32.083730 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-g7ft8"] Feb 17 15:15:33 crc kubenswrapper[4717]: I0217 15:15:33.868325 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f5488fe-614a-4ff6-bb53-f1578e913bdd" path="/var/lib/kubelet/pods/6f5488fe-614a-4ff6-bb53-f1578e913bdd/volumes" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.568713 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm"] Feb 17 15:15:39 crc kubenswrapper[4717]: E0217 15:15:39.569378 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f5488fe-614a-4ff6-bb53-f1578e913bdd" containerName="dnsmasq-dns" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.569394 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5488fe-614a-4ff6-bb53-f1578e913bdd" containerName="dnsmasq-dns" Feb 17 15:15:39 crc kubenswrapper[4717]: E0217 15:15:39.569421 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9ddb1b-7782-4cbe-beac-25d175842b8e" containerName="init" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.569430 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9ddb1b-7782-4cbe-beac-25d175842b8e" containerName="init" Feb 17 15:15:39 crc kubenswrapper[4717]: E0217 15:15:39.569446 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f5488fe-614a-4ff6-bb53-f1578e913bdd" containerName="init" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.569454 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5488fe-614a-4ff6-bb53-f1578e913bdd" 
containerName="init" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.569679 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f5488fe-614a-4ff6-bb53-f1578e913bdd" containerName="dnsmasq-dns" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.569693 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e9ddb1b-7782-4cbe-beac-25d175842b8e" containerName="init" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.570589 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.575743 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.575822 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.575747 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.579565 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.592732 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.592778 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.592827 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.592915 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ckqt\" (UniqueName: \"kubernetes.io/projected/de72738c-0584-4539-9bd9-92382a0f5538-kube-api-access-8ckqt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.595000 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm"] Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.696037 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.696155 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.696254 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.696457 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ckqt\" (UniqueName: \"kubernetes.io/projected/de72738c-0584-4539-9bd9-92382a0f5538-kube-api-access-8ckqt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.703957 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.705304 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: 
\"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.712625 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.719482 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ckqt\" (UniqueName: \"kubernetes.io/projected/de72738c-0584-4539-9bd9-92382a0f5538-kube-api-access-8ckqt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:39 crc kubenswrapper[4717]: I0217 15:15:39.895963 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:15:40 crc kubenswrapper[4717]: I0217 15:15:40.500042 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm"] Feb 17 15:15:40 crc kubenswrapper[4717]: W0217 15:15:40.505247 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde72738c_0584_4539_9bd9_92382a0f5538.slice/crio-db3ee3f5bc463a7b894899ff73640a0cb4f4b10c54740a19b270d0c77249a864 WatchSource:0}: Error finding container db3ee3f5bc463a7b894899ff73640a0cb4f4b10c54740a19b270d0c77249a864: Status 404 returned error can't find the container with id db3ee3f5bc463a7b894899ff73640a0cb4f4b10c54740a19b270d0c77249a864 Feb 17 15:15:40 crc kubenswrapper[4717]: I0217 15:15:40.822690 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" event={"ID":"de72738c-0584-4539-9bd9-92382a0f5538","Type":"ContainerStarted","Data":"db3ee3f5bc463a7b894899ff73640a0cb4f4b10c54740a19b270d0c77249a864"} Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.542971 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hqcp2"] Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.545589 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.565514 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-utilities\") pod \"redhat-operators-hqcp2\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.565883 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-catalog-content\") pod \"redhat-operators-hqcp2\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.565921 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhzj4\" (UniqueName: \"kubernetes.io/projected/399dcb0c-c150-4561-8220-475314c6b22f-kube-api-access-hhzj4\") pod \"redhat-operators-hqcp2\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.591845 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqcp2"] Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.667850 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-utilities\") pod \"redhat-operators-hqcp2\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.667920 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-catalog-content\") pod \"redhat-operators-hqcp2\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.667961 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhzj4\" (UniqueName: \"kubernetes.io/projected/399dcb0c-c150-4561-8220-475314c6b22f-kube-api-access-hhzj4\") pod \"redhat-operators-hqcp2\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.668508 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-utilities\") pod \"redhat-operators-hqcp2\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.668583 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-catalog-content\") pod \"redhat-operators-hqcp2\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.687744 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhzj4\" (UniqueName: \"kubernetes.io/projected/399dcb0c-c150-4561-8220-475314c6b22f-kube-api-access-hhzj4\") pod \"redhat-operators-hqcp2\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:15:42 crc kubenswrapper[4717]: I0217 15:15:42.882742 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:15:43 crc kubenswrapper[4717]: I0217 15:15:43.362052 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqcp2"] Feb 17 15:15:43 crc kubenswrapper[4717]: W0217 15:15:43.366967 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod399dcb0c_c150_4561_8220_475314c6b22f.slice/crio-6ea4f09e115a08dcd318b85f0fbd6fed7e5c224df9f57ec23c390897d7455d13 WatchSource:0}: Error finding container 6ea4f09e115a08dcd318b85f0fbd6fed7e5c224df9f57ec23c390897d7455d13: Status 404 returned error can't find the container with id 6ea4f09e115a08dcd318b85f0fbd6fed7e5c224df9f57ec23c390897d7455d13 Feb 17 15:15:43 crc kubenswrapper[4717]: I0217 15:15:43.852612 4717 generic.go:334] "Generic (PLEG): container finished" podID="399dcb0c-c150-4561-8220-475314c6b22f" containerID="1d4d806e135abbe76f8170c0c685fb1e8020ee8a3d6d45381351b21a9e50a835" exitCode=0 Feb 17 15:15:43 crc kubenswrapper[4717]: I0217 15:15:43.860488 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcp2" event={"ID":"399dcb0c-c150-4561-8220-475314c6b22f","Type":"ContainerDied","Data":"1d4d806e135abbe76f8170c0c685fb1e8020ee8a3d6d45381351b21a9e50a835"} Feb 17 15:15:43 crc kubenswrapper[4717]: I0217 15:15:43.860517 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcp2" event={"ID":"399dcb0c-c150-4561-8220-475314c6b22f","Type":"ContainerStarted","Data":"6ea4f09e115a08dcd318b85f0fbd6fed7e5c224df9f57ec23c390897d7455d13"} Feb 17 15:15:50 crc kubenswrapper[4717]: I0217 15:15:50.005982 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcp2" 
event={"ID":"399dcb0c-c150-4561-8220-475314c6b22f","Type":"ContainerStarted","Data":"58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77"} Feb 17 15:15:50 crc kubenswrapper[4717]: I0217 15:15:50.010310 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" event={"ID":"de72738c-0584-4539-9bd9-92382a0f5538","Type":"ContainerStarted","Data":"d36285bab227b04552e3e0789492c349a265bf27becd9be94f0210e74c0a51d7"} Feb 17 15:15:50 crc kubenswrapper[4717]: I0217 15:15:50.056549 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" podStartSLOduration=2.186906051 podStartE2EDuration="11.056529192s" podCreationTimestamp="2026-02-17 15:15:39 +0000 UTC" firstStartedPulling="2026-02-17 15:15:40.507697695 +0000 UTC m=+1406.923538181" lastFinishedPulling="2026-02-17 15:15:49.377320846 +0000 UTC m=+1415.793161322" observedRunningTime="2026-02-17 15:15:50.050804119 +0000 UTC m=+1416.466644665" watchObservedRunningTime="2026-02-17 15:15:50.056529192 +0000 UTC m=+1416.472369658" Feb 17 15:15:50 crc kubenswrapper[4717]: I0217 15:15:50.808151 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:15:50 crc kubenswrapper[4717]: I0217 15:15:50.808645 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:15:51 crc kubenswrapper[4717]: I0217 15:15:51.024329 4717 generic.go:334] "Generic (PLEG): container 
finished" podID="399dcb0c-c150-4561-8220-475314c6b22f" containerID="58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77" exitCode=0 Feb 17 15:15:51 crc kubenswrapper[4717]: I0217 15:15:51.024534 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcp2" event={"ID":"399dcb0c-c150-4561-8220-475314c6b22f","Type":"ContainerDied","Data":"58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77"} Feb 17 15:15:53 crc kubenswrapper[4717]: I0217 15:15:53.053988 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcp2" event={"ID":"399dcb0c-c150-4561-8220-475314c6b22f","Type":"ContainerStarted","Data":"68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d"} Feb 17 15:15:53 crc kubenswrapper[4717]: I0217 15:15:53.081706 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hqcp2" podStartSLOduration=2.3170194 podStartE2EDuration="11.081686543s" podCreationTimestamp="2026-02-17 15:15:42 +0000 UTC" firstStartedPulling="2026-02-17 15:15:43.855175283 +0000 UTC m=+1410.271015769" lastFinishedPulling="2026-02-17 15:15:52.619842436 +0000 UTC m=+1419.035682912" observedRunningTime="2026-02-17 15:15:53.073959234 +0000 UTC m=+1419.489799740" watchObservedRunningTime="2026-02-17 15:15:53.081686543 +0000 UTC m=+1419.497527029" Feb 17 15:15:54 crc kubenswrapper[4717]: I0217 15:15:54.066662 4717 generic.go:334] "Generic (PLEG): container finished" podID="93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7" containerID="bb6a4e263dc5b81a7a61668b497f7ce73802fdcd659e02836fe9888c1f6e0d8a" exitCode=0 Feb 17 15:15:54 crc kubenswrapper[4717]: I0217 15:15:54.066756 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7","Type":"ContainerDied","Data":"bb6a4e263dc5b81a7a61668b497f7ce73802fdcd659e02836fe9888c1f6e0d8a"} Feb 17 15:15:55 crc 
kubenswrapper[4717]: I0217 15:15:55.076740 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7","Type":"ContainerStarted","Data":"8f98dfddcb1cdcfd2bda7adfa0e1b156cd9c99fb9f2210b50f2c29a03736d3eb"} Feb 17 15:15:55 crc kubenswrapper[4717]: I0217 15:15:55.078224 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 15:15:55 crc kubenswrapper[4717]: I0217 15:15:55.103748 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.103733368 podStartE2EDuration="37.103733368s" podCreationTimestamp="2026-02-17 15:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:15:55.098582361 +0000 UTC m=+1421.514422837" watchObservedRunningTime="2026-02-17 15:15:55.103733368 +0000 UTC m=+1421.519573844" Feb 17 15:15:56 crc kubenswrapper[4717]: I0217 15:15:56.089201 4717 generic.go:334] "Generic (PLEG): container finished" podID="dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b" containerID="62ac1333afe0b4f25c393a01c7c405282d35ed82702c742e67823e2c7ecad3f1" exitCode=0 Feb 17 15:15:56 crc kubenswrapper[4717]: I0217 15:15:56.090801 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b","Type":"ContainerDied","Data":"62ac1333afe0b4f25c393a01c7c405282d35ed82702c742e67823e2c7ecad3f1"} Feb 17 15:15:57 crc kubenswrapper[4717]: I0217 15:15:57.103147 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b","Type":"ContainerStarted","Data":"f22e05e5f9ff16093866649cc33c81b553350931b4232bf528e11d600bc6c265"} Feb 17 15:15:57 crc kubenswrapper[4717]: I0217 15:15:57.103791 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:15:57 crc kubenswrapper[4717]: I0217 15:15:57.163706 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.163679177 podStartE2EDuration="38.163679177s" podCreationTimestamp="2026-02-17 15:15:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:15:57.136275049 +0000 UTC m=+1423.552115535" watchObservedRunningTime="2026-02-17 15:15:57.163679177 +0000 UTC m=+1423.579519653" Feb 17 15:15:58 crc kubenswrapper[4717]: I0217 15:15:58.750870 4717 scope.go:117] "RemoveContainer" containerID="361f4b86fb4b7008224e7c01e46e332e24b73a6c0b4fe530d41f2b094ee82d3a" Feb 17 15:15:58 crc kubenswrapper[4717]: I0217 15:15:58.774862 4717 scope.go:117] "RemoveContainer" containerID="05df5560daef0bb12f9e253b8f576459cf9a777b31cf7822e1610a254cb7ae4a" Feb 17 15:15:58 crc kubenswrapper[4717]: I0217 15:15:58.831058 4717 scope.go:117] "RemoveContainer" containerID="3d77d2b1625f12cc883922b78072847bd772f3918f9612764f90656bec71082b" Feb 17 15:16:02 crc kubenswrapper[4717]: I0217 15:16:02.152544 4717 generic.go:334] "Generic (PLEG): container finished" podID="de72738c-0584-4539-9bd9-92382a0f5538" containerID="d36285bab227b04552e3e0789492c349a265bf27becd9be94f0210e74c0a51d7" exitCode=0 Feb 17 15:16:02 crc kubenswrapper[4717]: I0217 15:16:02.152639 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" event={"ID":"de72738c-0584-4539-9bd9-92382a0f5538","Type":"ContainerDied","Data":"d36285bab227b04552e3e0789492c349a265bf27becd9be94f0210e74c0a51d7"} Feb 17 15:16:02 crc kubenswrapper[4717]: I0217 15:16:02.883424 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:16:02 crc kubenswrapper[4717]: 
I0217 15:16:02.883483 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.609401 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.733005 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ckqt\" (UniqueName: \"kubernetes.io/projected/de72738c-0584-4539-9bd9-92382a0f5538-kube-api-access-8ckqt\") pod \"de72738c-0584-4539-9bd9-92382a0f5538\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.733128 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-repo-setup-combined-ca-bundle\") pod \"de72738c-0584-4539-9bd9-92382a0f5538\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.733321 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-inventory\") pod \"de72738c-0584-4539-9bd9-92382a0f5538\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.733394 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-ssh-key-openstack-edpm-ipam\") pod \"de72738c-0584-4539-9bd9-92382a0f5538\" (UID: \"de72738c-0584-4539-9bd9-92382a0f5538\") " Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.762450 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/de72738c-0584-4539-9bd9-92382a0f5538-kube-api-access-8ckqt" (OuterVolumeSpecName: "kube-api-access-8ckqt") pod "de72738c-0584-4539-9bd9-92382a0f5538" (UID: "de72738c-0584-4539-9bd9-92382a0f5538"). InnerVolumeSpecName "kube-api-access-8ckqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.762576 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "de72738c-0584-4539-9bd9-92382a0f5538" (UID: "de72738c-0584-4539-9bd9-92382a0f5538"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.769453 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "de72738c-0584-4539-9bd9-92382a0f5538" (UID: "de72738c-0584-4539-9bd9-92382a0f5538"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.820801 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-inventory" (OuterVolumeSpecName: "inventory") pod "de72738c-0584-4539-9bd9-92382a0f5538" (UID: "de72738c-0584-4539-9bd9-92382a0f5538"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.839696 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.839740 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.839759 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ckqt\" (UniqueName: \"kubernetes.io/projected/de72738c-0584-4539-9bd9-92382a0f5538-kube-api-access-8ckqt\") on node \"crc\" DevicePath \"\"" Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.839770 4717 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de72738c-0584-4539-9bd9-92382a0f5538-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:16:03 crc kubenswrapper[4717]: I0217 15:16:03.950060 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hqcp2" podUID="399dcb0c-c150-4561-8220-475314c6b22f" containerName="registry-server" probeResult="failure" output=< Feb 17 15:16:03 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 17 15:16:03 crc kubenswrapper[4717]: > Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.173771 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" event={"ID":"de72738c-0584-4539-9bd9-92382a0f5538","Type":"ContainerDied","Data":"db3ee3f5bc463a7b894899ff73640a0cb4f4b10c54740a19b270d0c77249a864"} Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.174160 4717 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db3ee3f5bc463a7b894899ff73640a0cb4f4b10c54740a19b270d0c77249a864" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.174166 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.326832 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx"] Feb 17 15:16:04 crc kubenswrapper[4717]: E0217 15:16:04.327337 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de72738c-0584-4539-9bd9-92382a0f5538" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.327366 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="de72738c-0584-4539-9bd9-92382a0f5538" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.327629 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="de72738c-0584-4539-9bd9-92382a0f5538" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.328602 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.334308 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.334517 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.334692 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.334826 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.337464 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx"] Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.459227 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgd4p\" (UniqueName: \"kubernetes.io/projected/2047a6dc-1a1c-4b26-bee6-c16d812a99df-kube-api-access-wgd4p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-678sx\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.459330 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-678sx\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.459373 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-678sx\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.560917 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-678sx\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.560994 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-678sx\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.561108 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgd4p\" (UniqueName: \"kubernetes.io/projected/2047a6dc-1a1c-4b26-bee6-c16d812a99df-kube-api-access-wgd4p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-678sx\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.566362 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-678sx\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.570613 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-678sx\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.577377 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgd4p\" (UniqueName: \"kubernetes.io/projected/2047a6dc-1a1c-4b26-bee6-c16d812a99df-kube-api-access-wgd4p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-678sx\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:04 crc kubenswrapper[4717]: I0217 15:16:04.661739 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:05 crc kubenswrapper[4717]: I0217 15:16:05.227580 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx"] Feb 17 15:16:06 crc kubenswrapper[4717]: I0217 15:16:06.207208 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" event={"ID":"2047a6dc-1a1c-4b26-bee6-c16d812a99df","Type":"ContainerStarted","Data":"ec1ae10e5050e456e297f58271fafaf4ef9e66cc3cb259c174c081ceec2ed88e"} Feb 17 15:16:06 crc kubenswrapper[4717]: I0217 15:16:06.211603 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" event={"ID":"2047a6dc-1a1c-4b26-bee6-c16d812a99df","Type":"ContainerStarted","Data":"8ff50de939d748f33b1c707309b6ac1bab50935efb734cffa6a62eaca29d453e"} Feb 17 15:16:06 crc kubenswrapper[4717]: I0217 15:16:06.252650 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" podStartSLOduration=1.794965256 podStartE2EDuration="2.252627654s" podCreationTimestamp="2026-02-17 15:16:04 +0000 UTC" firstStartedPulling="2026-02-17 15:16:05.231896526 +0000 UTC m=+1431.647737002" lastFinishedPulling="2026-02-17 15:16:05.689558914 +0000 UTC m=+1432.105399400" observedRunningTime="2026-02-17 15:16:06.235479997 +0000 UTC m=+1432.651320503" watchObservedRunningTime="2026-02-17 15:16:06.252627654 +0000 UTC m=+1432.668468140" Feb 17 15:16:08 crc kubenswrapper[4717]: I0217 15:16:08.843332 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 15:16:09 crc kubenswrapper[4717]: I0217 15:16:09.238736 4717 generic.go:334] "Generic (PLEG): container finished" podID="2047a6dc-1a1c-4b26-bee6-c16d812a99df" 
containerID="ec1ae10e5050e456e297f58271fafaf4ef9e66cc3cb259c174c081ceec2ed88e" exitCode=0 Feb 17 15:16:09 crc kubenswrapper[4717]: I0217 15:16:09.238790 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" event={"ID":"2047a6dc-1a1c-4b26-bee6-c16d812a99df","Type":"ContainerDied","Data":"ec1ae10e5050e456e297f58271fafaf4ef9e66cc3cb259c174c081ceec2ed88e"} Feb 17 15:16:10 crc kubenswrapper[4717]: I0217 15:16:10.198380 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 15:16:10 crc kubenswrapper[4717]: I0217 15:16:10.757014 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:10 crc kubenswrapper[4717]: I0217 15:16:10.887137 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-inventory\") pod \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " Feb 17 15:16:10 crc kubenswrapper[4717]: I0217 15:16:10.887237 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-ssh-key-openstack-edpm-ipam\") pod \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " Feb 17 15:16:10 crc kubenswrapper[4717]: I0217 15:16:10.887277 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgd4p\" (UniqueName: \"kubernetes.io/projected/2047a6dc-1a1c-4b26-bee6-c16d812a99df-kube-api-access-wgd4p\") pod \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\" (UID: \"2047a6dc-1a1c-4b26-bee6-c16d812a99df\") " Feb 17 15:16:10 crc kubenswrapper[4717]: I0217 15:16:10.892596 4717 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2047a6dc-1a1c-4b26-bee6-c16d812a99df-kube-api-access-wgd4p" (OuterVolumeSpecName: "kube-api-access-wgd4p") pod "2047a6dc-1a1c-4b26-bee6-c16d812a99df" (UID: "2047a6dc-1a1c-4b26-bee6-c16d812a99df"). InnerVolumeSpecName "kube-api-access-wgd4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:16:10 crc kubenswrapper[4717]: I0217 15:16:10.915168 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2047a6dc-1a1c-4b26-bee6-c16d812a99df" (UID: "2047a6dc-1a1c-4b26-bee6-c16d812a99df"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:16:10 crc kubenswrapper[4717]: I0217 15:16:10.933466 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-inventory" (OuterVolumeSpecName: "inventory") pod "2047a6dc-1a1c-4b26-bee6-c16d812a99df" (UID: "2047a6dc-1a1c-4b26-bee6-c16d812a99df"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:16:10 crc kubenswrapper[4717]: I0217 15:16:10.991579 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:16:10 crc kubenswrapper[4717]: I0217 15:16:10.991627 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2047a6dc-1a1c-4b26-bee6-c16d812a99df-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:16:10 crc kubenswrapper[4717]: I0217 15:16:10.991648 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgd4p\" (UniqueName: \"kubernetes.io/projected/2047a6dc-1a1c-4b26-bee6-c16d812a99df-kube-api-access-wgd4p\") on node \"crc\" DevicePath \"\"" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.271850 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" event={"ID":"2047a6dc-1a1c-4b26-bee6-c16d812a99df","Type":"ContainerDied","Data":"8ff50de939d748f33b1c707309b6ac1bab50935efb734cffa6a62eaca29d453e"} Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.271894 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ff50de939d748f33b1c707309b6ac1bab50935efb734cffa6a62eaca29d453e" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.271942 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-678sx" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.356753 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk"] Feb 17 15:16:11 crc kubenswrapper[4717]: E0217 15:16:11.357229 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2047a6dc-1a1c-4b26-bee6-c16d812a99df" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.357246 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2047a6dc-1a1c-4b26-bee6-c16d812a99df" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.357439 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2047a6dc-1a1c-4b26-bee6-c16d812a99df" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.358080 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.361952 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.362267 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.362385 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.364052 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.373777 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk"] Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.500776 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.501003 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.501125 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x47xb\" (UniqueName: \"kubernetes.io/projected/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-kube-api-access-x47xb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.501224 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.602880 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x47xb\" (UniqueName: \"kubernetes.io/projected/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-kube-api-access-x47xb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.602946 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.603059 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.603128 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.606967 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.607496 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.608743 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.619498 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x47xb\" (UniqueName: \"kubernetes.io/projected/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-kube-api-access-x47xb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:11 crc kubenswrapper[4717]: I0217 15:16:11.685981 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:16:12 crc kubenswrapper[4717]: I0217 15:16:12.193531 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk"] Feb 17 15:16:12 crc kubenswrapper[4717]: W0217 15:16:12.200494 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc967ec5_ea6b_4da3_a6a4_c0a75f0a6d0c.slice/crio-773efba4f2bdce6fd188366e6091d3fd2d91a030b74c4d237a7135321aea3f8a WatchSource:0}: Error finding container 773efba4f2bdce6fd188366e6091d3fd2d91a030b74c4d237a7135321aea3f8a: Status 404 returned error can't find the container with id 773efba4f2bdce6fd188366e6091d3fd2d91a030b74c4d237a7135321aea3f8a Feb 17 15:16:12 crc kubenswrapper[4717]: I0217 15:16:12.282405 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" event={"ID":"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c","Type":"ContainerStarted","Data":"773efba4f2bdce6fd188366e6091d3fd2d91a030b74c4d237a7135321aea3f8a"} Feb 17 15:16:13 crc kubenswrapper[4717]: I0217 15:16:13.303806 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" 
event={"ID":"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c","Type":"ContainerStarted","Data":"2eff965323839be867d20947633730cb6bfc24bf4e4a1e37eb756c45df377c76"} Feb 17 15:16:13 crc kubenswrapper[4717]: I0217 15:16:13.326670 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" podStartSLOduration=1.924813934 podStartE2EDuration="2.326648038s" podCreationTimestamp="2026-02-17 15:16:11 +0000 UTC" firstStartedPulling="2026-02-17 15:16:12.205009547 +0000 UTC m=+1438.620850023" lastFinishedPulling="2026-02-17 15:16:12.606843651 +0000 UTC m=+1439.022684127" observedRunningTime="2026-02-17 15:16:13.321347018 +0000 UTC m=+1439.737187504" watchObservedRunningTime="2026-02-17 15:16:13.326648038 +0000 UTC m=+1439.742488514" Feb 17 15:16:13 crc kubenswrapper[4717]: I0217 15:16:13.948127 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hqcp2" podUID="399dcb0c-c150-4561-8220-475314c6b22f" containerName="registry-server" probeResult="failure" output=< Feb 17 15:16:13 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 17 15:16:13 crc kubenswrapper[4717]: > Feb 17 15:16:20 crc kubenswrapper[4717]: I0217 15:16:20.808268 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:16:20 crc kubenswrapper[4717]: I0217 15:16:20.809109 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:16:22 crc 
kubenswrapper[4717]: I0217 15:16:22.964744 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:16:23 crc kubenswrapper[4717]: I0217 15:16:23.008972 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:16:23 crc kubenswrapper[4717]: I0217 15:16:23.209440 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqcp2"] Feb 17 15:16:24 crc kubenswrapper[4717]: I0217 15:16:24.416929 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hqcp2" podUID="399dcb0c-c150-4561-8220-475314c6b22f" containerName="registry-server" containerID="cri-o://68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d" gracePeriod=2 Feb 17 15:16:24 crc kubenswrapper[4717]: I0217 15:16:24.922364 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.107747 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-catalog-content\") pod \"399dcb0c-c150-4561-8220-475314c6b22f\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.107825 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhzj4\" (UniqueName: \"kubernetes.io/projected/399dcb0c-c150-4561-8220-475314c6b22f-kube-api-access-hhzj4\") pod \"399dcb0c-c150-4561-8220-475314c6b22f\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.107861 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-utilities\") pod \"399dcb0c-c150-4561-8220-475314c6b22f\" (UID: \"399dcb0c-c150-4561-8220-475314c6b22f\") " Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.108781 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-utilities" (OuterVolumeSpecName: "utilities") pod "399dcb0c-c150-4561-8220-475314c6b22f" (UID: "399dcb0c-c150-4561-8220-475314c6b22f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.116341 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399dcb0c-c150-4561-8220-475314c6b22f-kube-api-access-hhzj4" (OuterVolumeSpecName: "kube-api-access-hhzj4") pod "399dcb0c-c150-4561-8220-475314c6b22f" (UID: "399dcb0c-c150-4561-8220-475314c6b22f"). InnerVolumeSpecName "kube-api-access-hhzj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.209784 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhzj4\" (UniqueName: \"kubernetes.io/projected/399dcb0c-c150-4561-8220-475314c6b22f-kube-api-access-hhzj4\") on node \"crc\" DevicePath \"\"" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.209815 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.244184 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "399dcb0c-c150-4561-8220-475314c6b22f" (UID: "399dcb0c-c150-4561-8220-475314c6b22f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.311413 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399dcb0c-c150-4561-8220-475314c6b22f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.429107 4717 generic.go:334] "Generic (PLEG): container finished" podID="399dcb0c-c150-4561-8220-475314c6b22f" containerID="68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d" exitCode=0 Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.429169 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcp2" event={"ID":"399dcb0c-c150-4561-8220-475314c6b22f","Type":"ContainerDied","Data":"68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d"} Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.429195 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqcp2" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.429212 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcp2" event={"ID":"399dcb0c-c150-4561-8220-475314c6b22f","Type":"ContainerDied","Data":"6ea4f09e115a08dcd318b85f0fbd6fed7e5c224df9f57ec23c390897d7455d13"} Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.429244 4717 scope.go:117] "RemoveContainer" containerID="68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.477253 4717 scope.go:117] "RemoveContainer" containerID="58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.484225 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqcp2"] Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.499848 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hqcp2"] Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.505473 4717 scope.go:117] "RemoveContainer" containerID="1d4d806e135abbe76f8170c0c685fb1e8020ee8a3d6d45381351b21a9e50a835" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.550064 4717 scope.go:117] "RemoveContainer" containerID="68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d" Feb 17 15:16:25 crc kubenswrapper[4717]: E0217 15:16:25.550879 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d\": container with ID starting with 68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d not found: ID does not exist" containerID="68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.551062 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d"} err="failed to get container status \"68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d\": rpc error: code = NotFound desc = could not find container \"68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d\": container with ID starting with 68ea919a01f74b3cdb5348df69c4c2d1749332058df831511f6d453e1f215d1d not found: ID does not exist" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.551167 4717 scope.go:117] "RemoveContainer" containerID="58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77" Feb 17 15:16:25 crc kubenswrapper[4717]: E0217 15:16:25.551568 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77\": container with ID starting with 58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77 not found: ID does not exist" containerID="58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.551629 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77"} err="failed to get container status \"58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77\": rpc error: code = NotFound desc = could not find container \"58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77\": container with ID starting with 58fd78466cb7434ba5e35ee29241c589671c769da527159f3af0000099320a77 not found: ID does not exist" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.551668 4717 scope.go:117] "RemoveContainer" containerID="1d4d806e135abbe76f8170c0c685fb1e8020ee8a3d6d45381351b21a9e50a835" Feb 17 15:16:25 crc kubenswrapper[4717]: E0217 
15:16:25.552269 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d4d806e135abbe76f8170c0c685fb1e8020ee8a3d6d45381351b21a9e50a835\": container with ID starting with 1d4d806e135abbe76f8170c0c685fb1e8020ee8a3d6d45381351b21a9e50a835 not found: ID does not exist" containerID="1d4d806e135abbe76f8170c0c685fb1e8020ee8a3d6d45381351b21a9e50a835" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.552304 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d4d806e135abbe76f8170c0c685fb1e8020ee8a3d6d45381351b21a9e50a835"} err="failed to get container status \"1d4d806e135abbe76f8170c0c685fb1e8020ee8a3d6d45381351b21a9e50a835\": rpc error: code = NotFound desc = could not find container \"1d4d806e135abbe76f8170c0c685fb1e8020ee8a3d6d45381351b21a9e50a835\": container with ID starting with 1d4d806e135abbe76f8170c0c685fb1e8020ee8a3d6d45381351b21a9e50a835 not found: ID does not exist" Feb 17 15:16:25 crc kubenswrapper[4717]: I0217 15:16:25.859659 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="399dcb0c-c150-4561-8220-475314c6b22f" path="/var/lib/kubelet/pods/399dcb0c-c150-4561-8220-475314c6b22f/volumes" Feb 17 15:16:50 crc kubenswrapper[4717]: I0217 15:16:50.808604 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:16:50 crc kubenswrapper[4717]: I0217 15:16:50.810881 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 17 15:16:50 crc kubenswrapper[4717]: I0217 15:16:50.811125 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:16:50 crc kubenswrapper[4717]: I0217 15:16:50.812503 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fdf1380c8cd7cb66606575dee73871bdf907cf6128e72b679677baf87d933dc1"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:16:50 crc kubenswrapper[4717]: I0217 15:16:50.812806 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://fdf1380c8cd7cb66606575dee73871bdf907cf6128e72b679677baf87d933dc1" gracePeriod=600 Feb 17 15:16:51 crc kubenswrapper[4717]: I0217 15:16:51.724918 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="fdf1380c8cd7cb66606575dee73871bdf907cf6128e72b679677baf87d933dc1" exitCode=0 Feb 17 15:16:51 crc kubenswrapper[4717]: I0217 15:16:51.725014 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"fdf1380c8cd7cb66606575dee73871bdf907cf6128e72b679677baf87d933dc1"} Feb 17 15:16:51 crc kubenswrapper[4717]: I0217 15:16:51.725251 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea"} Feb 17 15:16:51 crc 
kubenswrapper[4717]: I0217 15:16:51.725280 4717 scope.go:117] "RemoveContainer" containerID="f8e20bac773d3781e4315850afd9f8f1df648a8ef53c688d37ae5161d1be4600" Feb 17 15:16:58 crc kubenswrapper[4717]: I0217 15:16:58.978688 4717 scope.go:117] "RemoveContainer" containerID="a580cee00e166254db5557e0ef7b3d2e3f6757cb5754a1391c64c4a5c7dbe9a4" Feb 17 15:16:59 crc kubenswrapper[4717]: I0217 15:16:59.022911 4717 scope.go:117] "RemoveContainer" containerID="84ed6405ea6951a592a433821a319d57e411cedb11f5a4732e6cd0b1381802b8" Feb 17 15:17:59 crc kubenswrapper[4717]: I0217 15:17:59.170891 4717 scope.go:117] "RemoveContainer" containerID="0ec438df1e3fc332f6f924cc4a302b0afd151ca5713495ba5c9b9a30c2ccf935" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.660309 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5nntp"] Feb 17 15:18:32 crc kubenswrapper[4717]: E0217 15:18:32.661263 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399dcb0c-c150-4561-8220-475314c6b22f" containerName="registry-server" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.661282 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="399dcb0c-c150-4561-8220-475314c6b22f" containerName="registry-server" Feb 17 15:18:32 crc kubenswrapper[4717]: E0217 15:18:32.661320 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399dcb0c-c150-4561-8220-475314c6b22f" containerName="extract-content" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.661329 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="399dcb0c-c150-4561-8220-475314c6b22f" containerName="extract-content" Feb 17 15:18:32 crc kubenswrapper[4717]: E0217 15:18:32.661350 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399dcb0c-c150-4561-8220-475314c6b22f" containerName="extract-utilities" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.661358 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="399dcb0c-c150-4561-8220-475314c6b22f" containerName="extract-utilities" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.661636 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="399dcb0c-c150-4561-8220-475314c6b22f" containerName="registry-server" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.663467 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.672925 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5nntp"] Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.804419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-utilities\") pod \"redhat-marketplace-5nntp\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.804847 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-catalog-content\") pod \"redhat-marketplace-5nntp\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.804984 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgnk8\" (UniqueName: \"kubernetes.io/projected/dba58273-234e-468f-9155-ae97a54cce09-kube-api-access-rgnk8\") pod \"redhat-marketplace-5nntp\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.906349 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-utilities\") pod \"redhat-marketplace-5nntp\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.906466 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-catalog-content\") pod \"redhat-marketplace-5nntp\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.906493 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgnk8\" (UniqueName: \"kubernetes.io/projected/dba58273-234e-468f-9155-ae97a54cce09-kube-api-access-rgnk8\") pod \"redhat-marketplace-5nntp\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.907137 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-catalog-content\") pod \"redhat-marketplace-5nntp\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.907258 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-utilities\") pod \"redhat-marketplace-5nntp\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.929173 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rgnk8\" (UniqueName: \"kubernetes.io/projected/dba58273-234e-468f-9155-ae97a54cce09-kube-api-access-rgnk8\") pod \"redhat-marketplace-5nntp\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:32 crc kubenswrapper[4717]: I0217 15:18:32.999648 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:33 crc kubenswrapper[4717]: I0217 15:18:33.514580 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5nntp"] Feb 17 15:18:33 crc kubenswrapper[4717]: I0217 15:18:33.818019 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nntp" event={"ID":"dba58273-234e-468f-9155-ae97a54cce09","Type":"ContainerStarted","Data":"be777a3f5197cdf10ae2a24ffe8e5dade09517a4a52f8f6438bdc948bfd8fc69"} Feb 17 15:18:34 crc kubenswrapper[4717]: I0217 15:18:34.832653 4717 generic.go:334] "Generic (PLEG): container finished" podID="dba58273-234e-468f-9155-ae97a54cce09" containerID="f2010b1b99f85a5f8dbdd7b3cb3d435c2076f41e2b0587db06f44bf3108c8153" exitCode=0 Feb 17 15:18:34 crc kubenswrapper[4717]: I0217 15:18:34.832995 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nntp" event={"ID":"dba58273-234e-468f-9155-ae97a54cce09","Type":"ContainerDied","Data":"f2010b1b99f85a5f8dbdd7b3cb3d435c2076f41e2b0587db06f44bf3108c8153"} Feb 17 15:18:36 crc kubenswrapper[4717]: I0217 15:18:36.868948 4717 generic.go:334] "Generic (PLEG): container finished" podID="dba58273-234e-468f-9155-ae97a54cce09" containerID="2b6f62ac8b875de1546d27ee464b2206f9f11677465f73e4a80d0a5666405187" exitCode=0 Feb 17 15:18:36 crc kubenswrapper[4717]: I0217 15:18:36.869311 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nntp" 
event={"ID":"dba58273-234e-468f-9155-ae97a54cce09","Type":"ContainerDied","Data":"2b6f62ac8b875de1546d27ee464b2206f9f11677465f73e4a80d0a5666405187"} Feb 17 15:18:37 crc kubenswrapper[4717]: I0217 15:18:37.882158 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nntp" event={"ID":"dba58273-234e-468f-9155-ae97a54cce09","Type":"ContainerStarted","Data":"e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c"} Feb 17 15:18:37 crc kubenswrapper[4717]: I0217 15:18:37.903724 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5nntp" podStartSLOduration=3.401264698 podStartE2EDuration="5.903706443s" podCreationTimestamp="2026-02-17 15:18:32 +0000 UTC" firstStartedPulling="2026-02-17 15:18:34.837280146 +0000 UTC m=+1581.253120642" lastFinishedPulling="2026-02-17 15:18:37.339721921 +0000 UTC m=+1583.755562387" observedRunningTime="2026-02-17 15:18:37.89801302 +0000 UTC m=+1584.313853496" watchObservedRunningTime="2026-02-17 15:18:37.903706443 +0000 UTC m=+1584.319546909" Feb 17 15:18:43 crc kubenswrapper[4717]: I0217 15:18:43.000571 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:43 crc kubenswrapper[4717]: I0217 15:18:43.002803 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:43 crc kubenswrapper[4717]: I0217 15:18:43.060668 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:43 crc kubenswrapper[4717]: I0217 15:18:43.995561 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:44 crc kubenswrapper[4717]: I0217 15:18:44.059993 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-5nntp"] Feb 17 15:18:45 crc kubenswrapper[4717]: I0217 15:18:45.956239 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5nntp" podUID="dba58273-234e-468f-9155-ae97a54cce09" containerName="registry-server" containerID="cri-o://e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c" gracePeriod=2 Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.707360 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.835247 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-catalog-content\") pod \"dba58273-234e-468f-9155-ae97a54cce09\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.835351 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-utilities\") pod \"dba58273-234e-468f-9155-ae97a54cce09\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.835452 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgnk8\" (UniqueName: \"kubernetes.io/projected/dba58273-234e-468f-9155-ae97a54cce09-kube-api-access-rgnk8\") pod \"dba58273-234e-468f-9155-ae97a54cce09\" (UID: \"dba58273-234e-468f-9155-ae97a54cce09\") " Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.836461 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-utilities" (OuterVolumeSpecName: "utilities") pod "dba58273-234e-468f-9155-ae97a54cce09" (UID: 
"dba58273-234e-468f-9155-ae97a54cce09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.845407 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dba58273-234e-468f-9155-ae97a54cce09-kube-api-access-rgnk8" (OuterVolumeSpecName: "kube-api-access-rgnk8") pod "dba58273-234e-468f-9155-ae97a54cce09" (UID: "dba58273-234e-468f-9155-ae97a54cce09"). InnerVolumeSpecName "kube-api-access-rgnk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.880756 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dba58273-234e-468f-9155-ae97a54cce09" (UID: "dba58273-234e-468f-9155-ae97a54cce09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.937942 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.937991 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dba58273-234e-468f-9155-ae97a54cce09-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.938019 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgnk8\" (UniqueName: \"kubernetes.io/projected/dba58273-234e-468f-9155-ae97a54cce09-kube-api-access-rgnk8\") on node \"crc\" DevicePath \"\"" Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.967217 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="dba58273-234e-468f-9155-ae97a54cce09" containerID="e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c" exitCode=0 Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.967256 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nntp" event={"ID":"dba58273-234e-468f-9155-ae97a54cce09","Type":"ContainerDied","Data":"e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c"} Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.967283 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nntp" event={"ID":"dba58273-234e-468f-9155-ae97a54cce09","Type":"ContainerDied","Data":"be777a3f5197cdf10ae2a24ffe8e5dade09517a4a52f8f6438bdc948bfd8fc69"} Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.967302 4717 scope.go:117] "RemoveContainer" containerID="e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c" Feb 17 15:18:46 crc kubenswrapper[4717]: I0217 15:18:46.967427 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5nntp" Feb 17 15:18:47 crc kubenswrapper[4717]: I0217 15:18:47.002818 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5nntp"] Feb 17 15:18:47 crc kubenswrapper[4717]: I0217 15:18:47.008495 4717 scope.go:117] "RemoveContainer" containerID="2b6f62ac8b875de1546d27ee464b2206f9f11677465f73e4a80d0a5666405187" Feb 17 15:18:47 crc kubenswrapper[4717]: I0217 15:18:47.012444 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5nntp"] Feb 17 15:18:47 crc kubenswrapper[4717]: I0217 15:18:47.033728 4717 scope.go:117] "RemoveContainer" containerID="f2010b1b99f85a5f8dbdd7b3cb3d435c2076f41e2b0587db06f44bf3108c8153" Feb 17 15:18:47 crc kubenswrapper[4717]: I0217 15:18:47.079511 4717 scope.go:117] "RemoveContainer" containerID="e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c" Feb 17 15:18:47 crc kubenswrapper[4717]: E0217 15:18:47.080028 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c\": container with ID starting with e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c not found: ID does not exist" containerID="e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c" Feb 17 15:18:47 crc kubenswrapper[4717]: I0217 15:18:47.080059 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c"} err="failed to get container status \"e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c\": rpc error: code = NotFound desc = could not find container \"e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c\": container with ID starting with e0fbf68f9f7f26b48de07fd0e6203c65ee249993f9bcf0be14e7a4972004bd7c not found: 
ID does not exist" Feb 17 15:18:47 crc kubenswrapper[4717]: I0217 15:18:47.080202 4717 scope.go:117] "RemoveContainer" containerID="2b6f62ac8b875de1546d27ee464b2206f9f11677465f73e4a80d0a5666405187" Feb 17 15:18:47 crc kubenswrapper[4717]: E0217 15:18:47.080828 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6f62ac8b875de1546d27ee464b2206f9f11677465f73e4a80d0a5666405187\": container with ID starting with 2b6f62ac8b875de1546d27ee464b2206f9f11677465f73e4a80d0a5666405187 not found: ID does not exist" containerID="2b6f62ac8b875de1546d27ee464b2206f9f11677465f73e4a80d0a5666405187" Feb 17 15:18:47 crc kubenswrapper[4717]: I0217 15:18:47.080883 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6f62ac8b875de1546d27ee464b2206f9f11677465f73e4a80d0a5666405187"} err="failed to get container status \"2b6f62ac8b875de1546d27ee464b2206f9f11677465f73e4a80d0a5666405187\": rpc error: code = NotFound desc = could not find container \"2b6f62ac8b875de1546d27ee464b2206f9f11677465f73e4a80d0a5666405187\": container with ID starting with 2b6f62ac8b875de1546d27ee464b2206f9f11677465f73e4a80d0a5666405187 not found: ID does not exist" Feb 17 15:18:47 crc kubenswrapper[4717]: I0217 15:18:47.080913 4717 scope.go:117] "RemoveContainer" containerID="f2010b1b99f85a5f8dbdd7b3cb3d435c2076f41e2b0587db06f44bf3108c8153" Feb 17 15:18:47 crc kubenswrapper[4717]: E0217 15:18:47.081463 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2010b1b99f85a5f8dbdd7b3cb3d435c2076f41e2b0587db06f44bf3108c8153\": container with ID starting with f2010b1b99f85a5f8dbdd7b3cb3d435c2076f41e2b0587db06f44bf3108c8153 not found: ID does not exist" containerID="f2010b1b99f85a5f8dbdd7b3cb3d435c2076f41e2b0587db06f44bf3108c8153" Feb 17 15:18:47 crc kubenswrapper[4717]: I0217 15:18:47.081492 4717 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2010b1b99f85a5f8dbdd7b3cb3d435c2076f41e2b0587db06f44bf3108c8153"} err="failed to get container status \"f2010b1b99f85a5f8dbdd7b3cb3d435c2076f41e2b0587db06f44bf3108c8153\": rpc error: code = NotFound desc = could not find container \"f2010b1b99f85a5f8dbdd7b3cb3d435c2076f41e2b0587db06f44bf3108c8153\": container with ID starting with f2010b1b99f85a5f8dbdd7b3cb3d435c2076f41e2b0587db06f44bf3108c8153 not found: ID does not exist" Feb 17 15:18:47 crc kubenswrapper[4717]: I0217 15:18:47.857242 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dba58273-234e-468f-9155-ae97a54cce09" path="/var/lib/kubelet/pods/dba58273-234e-468f-9155-ae97a54cce09/volumes" Feb 17 15:19:05 crc kubenswrapper[4717]: I0217 15:19:05.167139 4717 generic.go:334] "Generic (PLEG): container finished" podID="fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c" containerID="2eff965323839be867d20947633730cb6bfc24bf4e4a1e37eb756c45df377c76" exitCode=0 Feb 17 15:19:05 crc kubenswrapper[4717]: I0217 15:19:05.167269 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" event={"ID":"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c","Type":"ContainerDied","Data":"2eff965323839be867d20947633730cb6bfc24bf4e4a1e37eb756c45df377c76"} Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.766062 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.857231 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-inventory\") pod \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.857320 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-ssh-key-openstack-edpm-ipam\") pod \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.857390 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x47xb\" (UniqueName: \"kubernetes.io/projected/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-kube-api-access-x47xb\") pod \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.857484 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-bootstrap-combined-ca-bundle\") pod \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\" (UID: \"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c\") " Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.866550 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-kube-api-access-x47xb" (OuterVolumeSpecName: "kube-api-access-x47xb") pod "fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c" (UID: "fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c"). InnerVolumeSpecName "kube-api-access-x47xb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.871945 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c" (UID: "fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.896825 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c" (UID: "fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.920375 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-inventory" (OuterVolumeSpecName: "inventory") pod "fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c" (UID: "fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.960975 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.961057 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x47xb\" (UniqueName: \"kubernetes.io/projected/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-kube-api-access-x47xb\") on node \"crc\" DevicePath \"\"" Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.961068 4717 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:19:06 crc kubenswrapper[4717]: I0217 15:19:06.961096 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.195010 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" event={"ID":"fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c","Type":"ContainerDied","Data":"773efba4f2bdce6fd188366e6091d3fd2d91a030b74c4d237a7135321aea3f8a"} Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.195220 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="773efba4f2bdce6fd188366e6091d3fd2d91a030b74c4d237a7135321aea3f8a" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.195118 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.286749 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s"] Feb 17 15:19:07 crc kubenswrapper[4717]: E0217 15:19:07.287327 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dba58273-234e-468f-9155-ae97a54cce09" containerName="extract-utilities" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.287350 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba58273-234e-468f-9155-ae97a54cce09" containerName="extract-utilities" Feb 17 15:19:07 crc kubenswrapper[4717]: E0217 15:19:07.287379 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.287390 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 15:19:07 crc kubenswrapper[4717]: E0217 15:19:07.287407 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dba58273-234e-468f-9155-ae97a54cce09" containerName="registry-server" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.287416 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba58273-234e-468f-9155-ae97a54cce09" containerName="registry-server" Feb 17 15:19:07 crc kubenswrapper[4717]: E0217 15:19:07.287439 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dba58273-234e-468f-9155-ae97a54cce09" containerName="extract-content" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.287447 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba58273-234e-468f-9155-ae97a54cce09" containerName="extract-content" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.287665 
4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="dba58273-234e-468f-9155-ae97a54cce09" containerName="registry-server" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.287698 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.288422 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.290928 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.290992 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.291303 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.291595 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.301810 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s"] Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.371550 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xt86s\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:19:07 
crc kubenswrapper[4717]: I0217 15:19:07.372453 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm6hd\" (UniqueName: \"kubernetes.io/projected/6f01b139-f61b-4935-930c-65756bd54cdc-kube-api-access-pm6hd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xt86s\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.372519 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xt86s\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.475396 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xt86s\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.475566 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm6hd\" (UniqueName: \"kubernetes.io/projected/6f01b139-f61b-4935-930c-65756bd54cdc-kube-api-access-pm6hd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xt86s\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.475597 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xt86s\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.486277 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xt86s\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.502327 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xt86s\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.516418 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm6hd\" (UniqueName: \"kubernetes.io/projected/6f01b139-f61b-4935-930c-65756bd54cdc-kube-api-access-pm6hd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xt86s\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:19:07 crc kubenswrapper[4717]: I0217 15:19:07.635627 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:19:08 crc kubenswrapper[4717]: I0217 15:19:08.172909 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s"] Feb 17 15:19:08 crc kubenswrapper[4717]: I0217 15:19:08.204509 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" event={"ID":"6f01b139-f61b-4935-930c-65756bd54cdc","Type":"ContainerStarted","Data":"3d783df7a55961f025af1dac3fa658c117720451e55bb3afda54acea9fc441cc"} Feb 17 15:19:09 crc kubenswrapper[4717]: I0217 15:19:09.217612 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" event={"ID":"6f01b139-f61b-4935-930c-65756bd54cdc","Type":"ContainerStarted","Data":"2e4fc808e90b629c0ac4254628ef39cd50f10fac9aa20c5b6a8f5c6ad2370ad1"} Feb 17 15:19:09 crc kubenswrapper[4717]: I0217 15:19:09.236746 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" podStartSLOduration=1.779450133 podStartE2EDuration="2.236731392s" podCreationTimestamp="2026-02-17 15:19:07 +0000 UTC" firstStartedPulling="2026-02-17 15:19:08.171310583 +0000 UTC m=+1614.587151059" lastFinishedPulling="2026-02-17 15:19:08.628591842 +0000 UTC m=+1615.044432318" observedRunningTime="2026-02-17 15:19:09.233739496 +0000 UTC m=+1615.649579972" watchObservedRunningTime="2026-02-17 15:19:09.236731392 +0000 UTC m=+1615.652571868" Feb 17 15:19:20 crc kubenswrapper[4717]: I0217 15:19:20.808349 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:19:20 
crc kubenswrapper[4717]: I0217 15:19:20.810232 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.403770 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k6frz"] Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.406852 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.422131 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6frz"] Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.533176 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-utilities\") pod \"certified-operators-k6frz\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.533239 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnmfq\" (UniqueName: \"kubernetes.io/projected/c63f6693-42df-44bc-b880-09f22756d97f-kube-api-access-qnmfq\") pod \"certified-operators-k6frz\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.533323 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-catalog-content\") pod \"certified-operators-k6frz\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.635044 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-catalog-content\") pod \"certified-operators-k6frz\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.635239 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-utilities\") pod \"certified-operators-k6frz\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.635298 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnmfq\" (UniqueName: \"kubernetes.io/projected/c63f6693-42df-44bc-b880-09f22756d97f-kube-api-access-qnmfq\") pod \"certified-operators-k6frz\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.635945 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-catalog-content\") pod \"certified-operators-k6frz\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.635971 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-utilities\") pod \"certified-operators-k6frz\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.663537 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnmfq\" (UniqueName: \"kubernetes.io/projected/c63f6693-42df-44bc-b880-09f22756d97f-kube-api-access-qnmfq\") pod \"certified-operators-k6frz\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:47 crc kubenswrapper[4717]: I0217 15:19:47.750842 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:48 crc kubenswrapper[4717]: I0217 15:19:48.041998 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6frz"] Feb 17 15:19:48 crc kubenswrapper[4717]: I0217 15:19:48.649397 4717 generic.go:334] "Generic (PLEG): container finished" podID="c63f6693-42df-44bc-b880-09f22756d97f" containerID="a18a38171d7d9837bd371f28abe6ce13873812e4a80298a79cb1023f4cd3ccd1" exitCode=0 Feb 17 15:19:48 crc kubenswrapper[4717]: I0217 15:19:48.649487 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6frz" event={"ID":"c63f6693-42df-44bc-b880-09f22756d97f","Type":"ContainerDied","Data":"a18a38171d7d9837bd371f28abe6ce13873812e4a80298a79cb1023f4cd3ccd1"} Feb 17 15:19:48 crc kubenswrapper[4717]: I0217 15:19:48.649818 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6frz" event={"ID":"c63f6693-42df-44bc-b880-09f22756d97f","Type":"ContainerStarted","Data":"039e342604fc7eecfe6fbb75466ba46845592eb6321650abb0d54aeae45ec11d"} Feb 17 15:19:48 crc kubenswrapper[4717]: I0217 15:19:48.652230 4717 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Feb 17 15:19:49 crc kubenswrapper[4717]: I0217 15:19:49.661578 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6frz" event={"ID":"c63f6693-42df-44bc-b880-09f22756d97f","Type":"ContainerStarted","Data":"7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675"} Feb 17 15:19:50 crc kubenswrapper[4717]: I0217 15:19:50.683231 4717 generic.go:334] "Generic (PLEG): container finished" podID="c63f6693-42df-44bc-b880-09f22756d97f" containerID="7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675" exitCode=0 Feb 17 15:19:50 crc kubenswrapper[4717]: I0217 15:19:50.683319 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6frz" event={"ID":"c63f6693-42df-44bc-b880-09f22756d97f","Type":"ContainerDied","Data":"7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675"} Feb 17 15:19:50 crc kubenswrapper[4717]: I0217 15:19:50.808322 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:19:50 crc kubenswrapper[4717]: I0217 15:19:50.809128 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:19:51 crc kubenswrapper[4717]: I0217 15:19:51.701418 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6frz" 
event={"ID":"c63f6693-42df-44bc-b880-09f22756d97f","Type":"ContainerStarted","Data":"6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac"} Feb 17 15:19:51 crc kubenswrapper[4717]: I0217 15:19:51.745786 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k6frz" podStartSLOduration=2.132270545 podStartE2EDuration="4.7457558s" podCreationTimestamp="2026-02-17 15:19:47 +0000 UTC" firstStartedPulling="2026-02-17 15:19:48.651826326 +0000 UTC m=+1655.067666832" lastFinishedPulling="2026-02-17 15:19:51.265311571 +0000 UTC m=+1657.681152087" observedRunningTime="2026-02-17 15:19:51.734219971 +0000 UTC m=+1658.150060497" watchObservedRunningTime="2026-02-17 15:19:51.7457558 +0000 UTC m=+1658.161596316" Feb 17 15:19:57 crc kubenswrapper[4717]: I0217 15:19:57.751265 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:57 crc kubenswrapper[4717]: I0217 15:19:57.752033 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:57 crc kubenswrapper[4717]: I0217 15:19:57.817981 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:57 crc kubenswrapper[4717]: I0217 15:19:57.876449 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:19:58 crc kubenswrapper[4717]: I0217 15:19:58.074317 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6frz"] Feb 17 15:19:59 crc kubenswrapper[4717]: I0217 15:19:59.325446 4717 scope.go:117] "RemoveContainer" containerID="635011d10084309b67449de60d2f160275fd59df48d3ec1d9679356789362eac" Feb 17 15:19:59 crc kubenswrapper[4717]: I0217 15:19:59.365287 4717 scope.go:117] "RemoveContainer" 
containerID="78b83d7a05fc7491d26d0f183c14475d542e4f23c36f1d2b6f518e567e67f749" Feb 17 15:19:59 crc kubenswrapper[4717]: I0217 15:19:59.825433 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k6frz" podUID="c63f6693-42df-44bc-b880-09f22756d97f" containerName="registry-server" containerID="cri-o://6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac" gracePeriod=2 Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.376659 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.442057 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-catalog-content\") pod \"c63f6693-42df-44bc-b880-09f22756d97f\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.442187 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnmfq\" (UniqueName: \"kubernetes.io/projected/c63f6693-42df-44bc-b880-09f22756d97f-kube-api-access-qnmfq\") pod \"c63f6693-42df-44bc-b880-09f22756d97f\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.442319 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-utilities\") pod \"c63f6693-42df-44bc-b880-09f22756d97f\" (UID: \"c63f6693-42df-44bc-b880-09f22756d97f\") " Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.443881 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-utilities" (OuterVolumeSpecName: "utilities") pod 
"c63f6693-42df-44bc-b880-09f22756d97f" (UID: "c63f6693-42df-44bc-b880-09f22756d97f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.448191 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c63f6693-42df-44bc-b880-09f22756d97f-kube-api-access-qnmfq" (OuterVolumeSpecName: "kube-api-access-qnmfq") pod "c63f6693-42df-44bc-b880-09f22756d97f" (UID: "c63f6693-42df-44bc-b880-09f22756d97f"). InnerVolumeSpecName "kube-api-access-qnmfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.545588 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnmfq\" (UniqueName: \"kubernetes.io/projected/c63f6693-42df-44bc-b880-09f22756d97f-kube-api-access-qnmfq\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.545632 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.844209 4717 generic.go:334] "Generic (PLEG): container finished" podID="c63f6693-42df-44bc-b880-09f22756d97f" containerID="6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac" exitCode=0 Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.844282 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6frz" event={"ID":"c63f6693-42df-44bc-b880-09f22756d97f","Type":"ContainerDied","Data":"6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac"} Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.844337 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6frz" 
event={"ID":"c63f6693-42df-44bc-b880-09f22756d97f","Type":"ContainerDied","Data":"039e342604fc7eecfe6fbb75466ba46845592eb6321650abb0d54aeae45ec11d"} Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.844352 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6frz" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.844370 4717 scope.go:117] "RemoveContainer" containerID="6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.880666 4717 scope.go:117] "RemoveContainer" containerID="7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.926704 4717 scope.go:117] "RemoveContainer" containerID="a18a38171d7d9837bd371f28abe6ce13873812e4a80298a79cb1023f4cd3ccd1" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.988288 4717 scope.go:117] "RemoveContainer" containerID="6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac" Feb 17 15:20:00 crc kubenswrapper[4717]: E0217 15:20:00.988745 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac\": container with ID starting with 6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac not found: ID does not exist" containerID="6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.988800 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac"} err="failed to get container status \"6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac\": rpc error: code = NotFound desc = could not find container \"6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac\": 
container with ID starting with 6f2d93259c530122f6444e5fb24d7d992c6eaac859fb9af27a732e479d7af1ac not found: ID does not exist" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.988835 4717 scope.go:117] "RemoveContainer" containerID="7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675" Feb 17 15:20:00 crc kubenswrapper[4717]: E0217 15:20:00.989280 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675\": container with ID starting with 7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675 not found: ID does not exist" containerID="7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.989322 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675"} err="failed to get container status \"7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675\": rpc error: code = NotFound desc = could not find container \"7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675\": container with ID starting with 7c7b29d938f03a0c3ee117699d0add7fd13d9edbb8abc675938f582308e0e675 not found: ID does not exist" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.989347 4717 scope.go:117] "RemoveContainer" containerID="a18a38171d7d9837bd371f28abe6ce13873812e4a80298a79cb1023f4cd3ccd1" Feb 17 15:20:00 crc kubenswrapper[4717]: E0217 15:20:00.989714 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a18a38171d7d9837bd371f28abe6ce13873812e4a80298a79cb1023f4cd3ccd1\": container with ID starting with a18a38171d7d9837bd371f28abe6ce13873812e4a80298a79cb1023f4cd3ccd1 not found: ID does not exist" 
containerID="a18a38171d7d9837bd371f28abe6ce13873812e4a80298a79cb1023f4cd3ccd1" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.989753 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18a38171d7d9837bd371f28abe6ce13873812e4a80298a79cb1023f4cd3ccd1"} err="failed to get container status \"a18a38171d7d9837bd371f28abe6ce13873812e4a80298a79cb1023f4cd3ccd1\": rpc error: code = NotFound desc = could not find container \"a18a38171d7d9837bd371f28abe6ce13873812e4a80298a79cb1023f4cd3ccd1\": container with ID starting with a18a38171d7d9837bd371f28abe6ce13873812e4a80298a79cb1023f4cd3ccd1 not found: ID does not exist" Feb 17 15:20:00 crc kubenswrapper[4717]: I0217 15:20:00.998914 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c63f6693-42df-44bc-b880-09f22756d97f" (UID: "c63f6693-42df-44bc-b880-09f22756d97f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:20:01 crc kubenswrapper[4717]: I0217 15:20:01.056800 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c63f6693-42df-44bc-b880-09f22756d97f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:01 crc kubenswrapper[4717]: I0217 15:20:01.181268 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6frz"] Feb 17 15:20:01 crc kubenswrapper[4717]: I0217 15:20:01.190965 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k6frz"] Feb 17 15:20:01 crc kubenswrapper[4717]: I0217 15:20:01.867772 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c63f6693-42df-44bc-b880-09f22756d97f" path="/var/lib/kubelet/pods/c63f6693-42df-44bc-b880-09f22756d97f/volumes" Feb 17 15:20:20 crc kubenswrapper[4717]: I0217 15:20:20.808246 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:20:20 crc kubenswrapper[4717]: I0217 15:20:20.808819 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:20:20 crc kubenswrapper[4717]: I0217 15:20:20.808888 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:20:20 crc kubenswrapper[4717]: I0217 15:20:20.809716 4717 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:20:20 crc kubenswrapper[4717]: I0217 15:20:20.809783 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" gracePeriod=600 Feb 17 15:20:20 crc kubenswrapper[4717]: E0217 15:20:20.939914 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:20:21 crc kubenswrapper[4717]: I0217 15:20:21.096936 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" exitCode=0 Feb 17 15:20:21 crc kubenswrapper[4717]: I0217 15:20:21.097003 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea"} Feb 17 15:20:21 crc kubenswrapper[4717]: I0217 15:20:21.097385 4717 scope.go:117] "RemoveContainer" containerID="fdf1380c8cd7cb66606575dee73871bdf907cf6128e72b679677baf87d933dc1" Feb 17 15:20:21 crc 
kubenswrapper[4717]: I0217 15:20:21.098135 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:20:21 crc kubenswrapper[4717]: E0217 15:20:21.098497 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:20:26 crc kubenswrapper[4717]: I0217 15:20:26.057746 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-fc2mw"] Feb 17 15:20:26 crc kubenswrapper[4717]: I0217 15:20:26.069382 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-fc2mw"] Feb 17 15:20:27 crc kubenswrapper[4717]: I0217 15:20:27.867380 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c92c72e4-581d-4c2d-939b-7a7403142041" path="/var/lib/kubelet/pods/c92c72e4-581d-4c2d-939b-7a7403142041/volumes" Feb 17 15:20:34 crc kubenswrapper[4717]: I0217 15:20:34.847177 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:20:34 crc kubenswrapper[4717]: E0217 15:20:34.848454 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:20:36 crc kubenswrapper[4717]: I0217 15:20:36.053947 4717 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/placement-db-create-fxcxg"] Feb 17 15:20:36 crc kubenswrapper[4717]: I0217 15:20:36.072296 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-fxcxg"] Feb 17 15:20:37 crc kubenswrapper[4717]: I0217 15:20:37.284948 4717 generic.go:334] "Generic (PLEG): container finished" podID="6f01b139-f61b-4935-930c-65756bd54cdc" containerID="2e4fc808e90b629c0ac4254628ef39cd50f10fac9aa20c5b6a8f5c6ad2370ad1" exitCode=0 Feb 17 15:20:37 crc kubenswrapper[4717]: I0217 15:20:37.285010 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" event={"ID":"6f01b139-f61b-4935-930c-65756bd54cdc","Type":"ContainerDied","Data":"2e4fc808e90b629c0ac4254628ef39cd50f10fac9aa20c5b6a8f5c6ad2370ad1"} Feb 17 15:20:37 crc kubenswrapper[4717]: I0217 15:20:37.881539 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76fdb52f-cf38-4b2d-8997-ebcb0395e6ea" path="/var/lib/kubelet/pods/76fdb52f-cf38-4b2d-8997-ebcb0395e6ea/volumes" Feb 17 15:20:38 crc kubenswrapper[4717]: I0217 15:20:38.795176 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:20:38 crc kubenswrapper[4717]: I0217 15:20:38.871057 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-inventory\") pod \"6f01b139-f61b-4935-930c-65756bd54cdc\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " Feb 17 15:20:38 crc kubenswrapper[4717]: I0217 15:20:38.871478 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm6hd\" (UniqueName: \"kubernetes.io/projected/6f01b139-f61b-4935-930c-65756bd54cdc-kube-api-access-pm6hd\") pod \"6f01b139-f61b-4935-930c-65756bd54cdc\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " Feb 17 15:20:38 crc kubenswrapper[4717]: I0217 15:20:38.871579 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-ssh-key-openstack-edpm-ipam\") pod \"6f01b139-f61b-4935-930c-65756bd54cdc\" (UID: \"6f01b139-f61b-4935-930c-65756bd54cdc\") " Feb 17 15:20:38 crc kubenswrapper[4717]: I0217 15:20:38.879414 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f01b139-f61b-4935-930c-65756bd54cdc-kube-api-access-pm6hd" (OuterVolumeSpecName: "kube-api-access-pm6hd") pod "6f01b139-f61b-4935-930c-65756bd54cdc" (UID: "6f01b139-f61b-4935-930c-65756bd54cdc"). InnerVolumeSpecName "kube-api-access-pm6hd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:38 crc kubenswrapper[4717]: I0217 15:20:38.896331 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6f01b139-f61b-4935-930c-65756bd54cdc" (UID: "6f01b139-f61b-4935-930c-65756bd54cdc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:38 crc kubenswrapper[4717]: I0217 15:20:38.899146 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-inventory" (OuterVolumeSpecName: "inventory") pod "6f01b139-f61b-4935-930c-65756bd54cdc" (UID: "6f01b139-f61b-4935-930c-65756bd54cdc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:38 crc kubenswrapper[4717]: I0217 15:20:38.975159 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm6hd\" (UniqueName: \"kubernetes.io/projected/6f01b139-f61b-4935-930c-65756bd54cdc-kube-api-access-pm6hd\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:38 crc kubenswrapper[4717]: I0217 15:20:38.975205 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:38 crc kubenswrapper[4717]: I0217 15:20:38.975225 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f01b139-f61b-4935-930c-65756bd54cdc-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.031399 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-tcpcx"] Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 
15:20:39.041877 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-tcpcx"] Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.310325 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" event={"ID":"6f01b139-f61b-4935-930c-65756bd54cdc","Type":"ContainerDied","Data":"3d783df7a55961f025af1dac3fa658c117720451e55bb3afda54acea9fc441cc"} Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.310633 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d783df7a55961f025af1dac3fa658c117720451e55bb3afda54acea9fc441cc" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.310411 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xt86s" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.480114 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh"] Feb 17 15:20:39 crc kubenswrapper[4717]: E0217 15:20:39.480552 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63f6693-42df-44bc-b880-09f22756d97f" containerName="extract-content" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.480571 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63f6693-42df-44bc-b880-09f22756d97f" containerName="extract-content" Feb 17 15:20:39 crc kubenswrapper[4717]: E0217 15:20:39.480595 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63f6693-42df-44bc-b880-09f22756d97f" containerName="registry-server" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.480604 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63f6693-42df-44bc-b880-09f22756d97f" containerName="registry-server" Feb 17 15:20:39 crc kubenswrapper[4717]: E0217 15:20:39.480614 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c63f6693-42df-44bc-b880-09f22756d97f" containerName="extract-utilities" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.480622 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63f6693-42df-44bc-b880-09f22756d97f" containerName="extract-utilities" Feb 17 15:20:39 crc kubenswrapper[4717]: E0217 15:20:39.480653 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f01b139-f61b-4935-930c-65756bd54cdc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.480661 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f01b139-f61b-4935-930c-65756bd54cdc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.480875 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f01b139-f61b-4935-930c-65756bd54cdc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.480906 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c63f6693-42df-44bc-b880-09f22756d97f" containerName="registry-server" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.481636 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.484244 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.484513 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.485017 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.486223 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.492238 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh"] Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.585907 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.586010 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcr8m\" (UniqueName: \"kubernetes.io/projected/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-kube-api-access-bcr8m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 
15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.586047 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.688972 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.689195 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcr8m\" (UniqueName: \"kubernetes.io/projected/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-kube-api-access-bcr8m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.689301 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.694996 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.696047 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.707390 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcr8m\" (UniqueName: \"kubernetes.io/projected/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-kube-api-access-bcr8m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.797825 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:20:39 crc kubenswrapper[4717]: I0217 15:20:39.867530 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3786e2-123b-4300-86fb-505483139531" path="/var/lib/kubelet/pods/5b3786e2-123b-4300-86fb-505483139531/volumes" Feb 17 15:20:40 crc kubenswrapper[4717]: I0217 15:20:40.394247 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh"] Feb 17 15:20:41 crc kubenswrapper[4717]: I0217 15:20:41.327195 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" event={"ID":"ce8505ef-a14d-4936-b9c2-4334b5cf69b1","Type":"ContainerStarted","Data":"4173bea06703835d3fd9c9ab98a856f13e588b999ef91d5d1f901b6ed06ec2b8"} Feb 17 15:20:41 crc kubenswrapper[4717]: I0217 15:20:41.327839 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" event={"ID":"ce8505ef-a14d-4936-b9c2-4334b5cf69b1","Type":"ContainerStarted","Data":"5bd51ba2b609225321d301c36c86f9bcc92bc7671ff5747e6e560cda5caeb93e"} Feb 17 15:20:41 crc kubenswrapper[4717]: I0217 15:20:41.352557 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" podStartSLOduration=1.713709502 podStartE2EDuration="2.352533794s" podCreationTimestamp="2026-02-17 15:20:39 +0000 UTC" firstStartedPulling="2026-02-17 15:20:40.393154879 +0000 UTC m=+1706.808995355" lastFinishedPulling="2026-02-17 15:20:41.031979171 +0000 UTC m=+1707.447819647" observedRunningTime="2026-02-17 15:20:41.346050159 +0000 UTC m=+1707.761890705" watchObservedRunningTime="2026-02-17 15:20:41.352533794 +0000 UTC m=+1707.768374290" Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.048673 4717 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/keystone-dfdd-account-create-update-gm7bt"] Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.067638 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-h4rqv"] Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.085530 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e750-account-create-update-kv9rx"] Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.103304 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-h4rqv"] Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.111778 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-468e-account-create-update-pf2dw"] Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.123873 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-dfdd-account-create-update-gm7bt"] Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.131948 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e750-account-create-update-kv9rx"] Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.139912 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-468e-account-create-update-pf2dw"] Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.858698 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:20:45 crc kubenswrapper[4717]: E0217 15:20:45.859069 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:20:45 crc 
kubenswrapper[4717]: I0217 15:20:45.860531 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d66078-a470-4a86-92f9-b22752be660a" path="/var/lib/kubelet/pods/32d66078-a470-4a86-92f9-b22752be660a/volumes" Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.861275 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb161a2-7b80-400b-a4eb-ca14e4386721" path="/var/lib/kubelet/pods/4fb161a2-7b80-400b-a4eb-ca14e4386721/volumes" Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.861968 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9527cb55-cce6-4fb4-aa5c-b2495a6264f6" path="/var/lib/kubelet/pods/9527cb55-cce6-4fb4-aa5c-b2495a6264f6/volumes" Feb 17 15:20:45 crc kubenswrapper[4717]: I0217 15:20:45.862699 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0e668f2-e8a2-403b-af75-9ea042dad451" path="/var/lib/kubelet/pods/c0e668f2-e8a2-403b-af75-9ea042dad451/volumes" Feb 17 15:20:55 crc kubenswrapper[4717]: I0217 15:20:55.036563 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-v77mr"] Feb 17 15:20:55 crc kubenswrapper[4717]: I0217 15:20:55.047704 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-v77mr"] Feb 17 15:20:55 crc kubenswrapper[4717]: I0217 15:20:55.863213 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceff3034-f756-4cc9-9b21-3c38aed2b429" path="/var/lib/kubelet/pods/ceff3034-f756-4cc9-9b21-3c38aed2b429/volumes" Feb 17 15:20:56 crc kubenswrapper[4717]: I0217 15:20:56.059196 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-716d-account-create-update-npdx8"] Feb 17 15:20:56 crc kubenswrapper[4717]: I0217 15:20:56.073771 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-716d-account-create-update-npdx8"] Feb 17 15:20:57 crc kubenswrapper[4717]: I0217 15:20:57.868357 4717 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="bf67c089-b17d-42bb-9cdf-3b5252b212c1" path="/var/lib/kubelet/pods/bf67c089-b17d-42bb-9cdf-3b5252b212c1/volumes" Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.074171 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-63ba-account-create-update-phs52"] Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.093025 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-27f0-account-create-update-59n4f"] Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.107598 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-63ba-account-create-update-phs52"] Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.122569 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-27f0-account-create-update-59n4f"] Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.483682 4717 scope.go:117] "RemoveContainer" containerID="5ca72881a3fd4b3fd84db74172df8d5fdd228172442a767e1fa068eb71eae143" Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.517283 4717 scope.go:117] "RemoveContainer" containerID="94173b96dd1944624e07d77926d4d98a3c750542042e65d34c958c10f0b69bf5" Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.587778 4717 scope.go:117] "RemoveContainer" containerID="3d6ba34bc3431518d9c7baeb3ed11e5dc4ddcaed3cc8df7d6b8381813a94d076" Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.668145 4717 scope.go:117] "RemoveContainer" containerID="17080c43a13839d51643cf8b58c0f9b6a44de0d4e29acd113c75e4b57e57a55b" Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.714211 4717 scope.go:117] "RemoveContainer" containerID="4c52b97c2f1f802fa203b77aa3e96251c4bfb84f3f0c1f41b12f110bdbce3bbb" Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.752669 4717 scope.go:117] "RemoveContainer" containerID="35d412548ce2232ec5c975027c9be64cea9e54fa413ebf26c9ba7a6b2ad5358f" Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.798763 4717 
scope.go:117] "RemoveContainer" containerID="d8c75d7b87cd1ef5c0b5186198f05269382351e290640d8dc86e9ecc80ebdee6" Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.836117 4717 scope.go:117] "RemoveContainer" containerID="5a2e52a7881ad868f34911f20aaa2de38d025d14a3689528ca22d9d37e1c0e1f" Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.861200 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44668081-3deb-40f0-a60e-302ee0a8b85a" path="/var/lib/kubelet/pods/44668081-3deb-40f0-a60e-302ee0a8b85a/volumes" Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.861943 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0a6ff8-9436-409a-b86f-df23c821c302" path="/var/lib/kubelet/pods/7e0a6ff8-9436-409a-b86f-df23c821c302/volumes" Feb 17 15:20:59 crc kubenswrapper[4717]: I0217 15:20:59.869022 4717 scope.go:117] "RemoveContainer" containerID="13f05a509876ea4dab3270bff59fb10757072e420fe157431ed2b3f7024dc964" Feb 17 15:21:00 crc kubenswrapper[4717]: I0217 15:21:00.847383 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:21:00 crc kubenswrapper[4717]: E0217 15:21:00.848159 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:21:07 crc kubenswrapper[4717]: I0217 15:21:07.052927 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6b77v"] Feb 17 15:21:07 crc kubenswrapper[4717]: I0217 15:21:07.072295 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-vsfnm"] Feb 17 15:21:07 crc 
kubenswrapper[4717]: I0217 15:21:07.086129 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-vsfnm"] Feb 17 15:21:07 crc kubenswrapper[4717]: I0217 15:21:07.095901 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6b77v"] Feb 17 15:21:07 crc kubenswrapper[4717]: I0217 15:21:07.863067 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd826039-5737-4f09-b722-e6263c314341" path="/var/lib/kubelet/pods/bd826039-5737-4f09-b722-e6263c314341/volumes" Feb 17 15:21:07 crc kubenswrapper[4717]: I0217 15:21:07.864593 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e74a323a-cd4e-435f-beaa-9d3b6689e98c" path="/var/lib/kubelet/pods/e74a323a-cd4e-435f-beaa-9d3b6689e98c/volumes" Feb 17 15:21:15 crc kubenswrapper[4717]: I0217 15:21:15.041958 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kvdwd"] Feb 17 15:21:15 crc kubenswrapper[4717]: I0217 15:21:15.053032 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kvdwd"] Feb 17 15:21:15 crc kubenswrapper[4717]: I0217 15:21:15.852039 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:21:15 crc kubenswrapper[4717]: E0217 15:21:15.852583 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:21:15 crc kubenswrapper[4717]: I0217 15:21:15.856135 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f16717-cc42-4465-8a1e-b7377b11b987" 
path="/var/lib/kubelet/pods/35f16717-cc42-4465-8a1e-b7377b11b987/volumes" Feb 17 15:21:29 crc kubenswrapper[4717]: I0217 15:21:29.846897 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:21:29 crc kubenswrapper[4717]: E0217 15:21:29.847886 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:21:33 crc kubenswrapper[4717]: I0217 15:21:33.043531 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hvpfc"] Feb 17 15:21:33 crc kubenswrapper[4717]: I0217 15:21:33.054318 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hvpfc"] Feb 17 15:21:33 crc kubenswrapper[4717]: I0217 15:21:33.863973 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b96c066-9919-4133-93df-69c9abdc0c6c" path="/var/lib/kubelet/pods/0b96c066-9919-4133-93df-69c9abdc0c6c/volumes" Feb 17 15:21:43 crc kubenswrapper[4717]: I0217 15:21:43.847237 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:21:43 crc kubenswrapper[4717]: E0217 15:21:43.848389 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:21:51 crc 
kubenswrapper[4717]: I0217 15:21:51.113348 4717 generic.go:334] "Generic (PLEG): container finished" podID="ce8505ef-a14d-4936-b9c2-4334b5cf69b1" containerID="4173bea06703835d3fd9c9ab98a856f13e588b999ef91d5d1f901b6ed06ec2b8" exitCode=0 Feb 17 15:21:51 crc kubenswrapper[4717]: I0217 15:21:51.113485 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" event={"ID":"ce8505ef-a14d-4936-b9c2-4334b5cf69b1","Type":"ContainerDied","Data":"4173bea06703835d3fd9c9ab98a856f13e588b999ef91d5d1f901b6ed06ec2b8"} Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.038844 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8zrk2"] Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.053413 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8zrk2"] Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.613945 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.753215 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-ssh-key-openstack-edpm-ipam\") pod \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.753309 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcr8m\" (UniqueName: \"kubernetes.io/projected/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-kube-api-access-bcr8m\") pod \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.753353 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-inventory\") pod \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\" (UID: \"ce8505ef-a14d-4936-b9c2-4334b5cf69b1\") " Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.767350 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-kube-api-access-bcr8m" (OuterVolumeSpecName: "kube-api-access-bcr8m") pod "ce8505ef-a14d-4936-b9c2-4334b5cf69b1" (UID: "ce8505ef-a14d-4936-b9c2-4334b5cf69b1"). InnerVolumeSpecName "kube-api-access-bcr8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.781626 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-inventory" (OuterVolumeSpecName: "inventory") pod "ce8505ef-a14d-4936-b9c2-4334b5cf69b1" (UID: "ce8505ef-a14d-4936-b9c2-4334b5cf69b1"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.786937 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ce8505ef-a14d-4936-b9c2-4334b5cf69b1" (UID: "ce8505ef-a14d-4936-b9c2-4334b5cf69b1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.856297 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.856325 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:21:52 crc kubenswrapper[4717]: I0217 15:21:52.856337 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcr8m\" (UniqueName: \"kubernetes.io/projected/ce8505ef-a14d-4936-b9c2-4334b5cf69b1-kube-api-access-bcr8m\") on node \"crc\" DevicePath \"\"" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.133273 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" event={"ID":"ce8505ef-a14d-4936-b9c2-4334b5cf69b1","Type":"ContainerDied","Data":"5bd51ba2b609225321d301c36c86f9bcc92bc7671ff5747e6e560cda5caeb93e"} Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.133322 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bd51ba2b609225321d301c36c86f9bcc92bc7671ff5747e6e560cda5caeb93e" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 
15:21:53.133744 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.243575 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf"] Feb 17 15:21:53 crc kubenswrapper[4717]: E0217 15:21:53.244177 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8505ef-a14d-4936-b9c2-4334b5cf69b1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.244208 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8505ef-a14d-4936-b9c2-4334b5cf69b1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.244410 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8505ef-a14d-4936-b9c2-4334b5cf69b1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.245244 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.247852 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.247970 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.248653 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.250039 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.256589 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf"] Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.372210 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-frrhf\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.372553 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6687\" (UniqueName: \"kubernetes.io/projected/6dfba30c-cf0e-4165-bc37-8284cc15b50f-kube-api-access-x6687\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-frrhf\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 
15:21:53.372722 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-frrhf\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.475436 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-frrhf\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.475608 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6687\" (UniqueName: \"kubernetes.io/projected/6dfba30c-cf0e-4165-bc37-8284cc15b50f-kube-api-access-x6687\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-frrhf\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.475682 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-frrhf\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.482522 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-frrhf\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.483865 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-frrhf\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.504512 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6687\" (UniqueName: \"kubernetes.io/projected/6dfba30c-cf0e-4165-bc37-8284cc15b50f-kube-api-access-x6687\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-frrhf\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.568279 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:21:53 crc kubenswrapper[4717]: I0217 15:21:53.859397 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="246d9375-0f70-4c31-ac82-63ecc1bdcd2b" path="/var/lib/kubelet/pods/246d9375-0f70-4c31-ac82-63ecc1bdcd2b/volumes" Feb 17 15:21:54 crc kubenswrapper[4717]: I0217 15:21:54.125966 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf"] Feb 17 15:21:54 crc kubenswrapper[4717]: I0217 15:21:54.147123 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" event={"ID":"6dfba30c-cf0e-4165-bc37-8284cc15b50f","Type":"ContainerStarted","Data":"fe017ca5cb1c81623af98c6af0b49157f56595d69f796882901acdb81ddb1720"} Feb 17 15:21:56 crc kubenswrapper[4717]: I0217 15:21:56.184405 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" event={"ID":"6dfba30c-cf0e-4165-bc37-8284cc15b50f","Type":"ContainerStarted","Data":"cc3b0b726ddb88a537eb0f366a38179c59d3840bd5b1c45b6b65f63e6c7b0427"} Feb 17 15:21:56 crc kubenswrapper[4717]: I0217 15:21:56.217577 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" podStartSLOduration=2.434937494 podStartE2EDuration="3.217545792s" podCreationTimestamp="2026-02-17 15:21:53 +0000 UTC" firstStartedPulling="2026-02-17 15:21:54.124014135 +0000 UTC m=+1780.539854621" lastFinishedPulling="2026-02-17 15:21:54.906622403 +0000 UTC m=+1781.322462919" observedRunningTime="2026-02-17 15:21:56.201729393 +0000 UTC m=+1782.617569909" watchObservedRunningTime="2026-02-17 15:21:56.217545792 +0000 UTC m=+1782.633386298" Feb 17 15:21:58 crc kubenswrapper[4717]: I0217 15:21:58.847033 4717 scope.go:117] "RemoveContainer" 
containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:21:58 crc kubenswrapper[4717]: E0217 15:21:58.847926 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:22:00 crc kubenswrapper[4717]: I0217 15:22:00.127728 4717 scope.go:117] "RemoveContainer" containerID="94b7e9b20e2fe255f7f53b2a331597b32c3774ac2a004642df653176a28a70b0" Feb 17 15:22:00 crc kubenswrapper[4717]: I0217 15:22:00.172157 4717 scope.go:117] "RemoveContainer" containerID="94672654284a0165636c6d9fe9de12d3bde736cb946139baf5afc940ef05d8b9" Feb 17 15:22:00 crc kubenswrapper[4717]: I0217 15:22:00.225721 4717 generic.go:334] "Generic (PLEG): container finished" podID="6dfba30c-cf0e-4165-bc37-8284cc15b50f" containerID="cc3b0b726ddb88a537eb0f366a38179c59d3840bd5b1c45b6b65f63e6c7b0427" exitCode=0 Feb 17 15:22:00 crc kubenswrapper[4717]: I0217 15:22:00.225799 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" event={"ID":"6dfba30c-cf0e-4165-bc37-8284cc15b50f","Type":"ContainerDied","Data":"cc3b0b726ddb88a537eb0f366a38179c59d3840bd5b1c45b6b65f63e6c7b0427"} Feb 17 15:22:00 crc kubenswrapper[4717]: I0217 15:22:00.240758 4717 scope.go:117] "RemoveContainer" containerID="5e39d789f33c6404226590e54b1fffc1b8489430ebf7ad53c4a39530fdcd4a6d" Feb 17 15:22:00 crc kubenswrapper[4717]: I0217 15:22:00.306969 4717 scope.go:117] "RemoveContainer" containerID="0fbede69481c75260fa48a7c223c63adac345952e5dbf8816b3cacc2d8a97fab" Feb 17 15:22:00 crc kubenswrapper[4717]: I0217 15:22:00.331810 4717 scope.go:117] "RemoveContainer" 
containerID="ec74d17aa82c8980ca1c5457c5c15d213e2abfa96f837f151625e09426ab161a" Feb 17 15:22:00 crc kubenswrapper[4717]: I0217 15:22:00.391903 4717 scope.go:117] "RemoveContainer" containerID="8530531676be825a0cc4aec8a3b4f9dcc05d36a222dad3ae51e91e2760f22dab" Feb 17 15:22:00 crc kubenswrapper[4717]: I0217 15:22:00.415457 4717 scope.go:117] "RemoveContainer" containerID="c15c6278b2ff88b6bdc31bb1c6e474cadfe194c14c9d6f908a15298bdf9ea965" Feb 17 15:22:01 crc kubenswrapper[4717]: I0217 15:22:01.748125 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:22:01 crc kubenswrapper[4717]: I0217 15:22:01.855158 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-ssh-key-openstack-edpm-ipam\") pod \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " Feb 17 15:22:01 crc kubenswrapper[4717]: I0217 15:22:01.855256 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-inventory\") pod \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " Feb 17 15:22:01 crc kubenswrapper[4717]: I0217 15:22:01.855500 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6687\" (UniqueName: \"kubernetes.io/projected/6dfba30c-cf0e-4165-bc37-8284cc15b50f-kube-api-access-x6687\") pod \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\" (UID: \"6dfba30c-cf0e-4165-bc37-8284cc15b50f\") " Feb 17 15:22:01 crc kubenswrapper[4717]: I0217 15:22:01.866917 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfba30c-cf0e-4165-bc37-8284cc15b50f-kube-api-access-x6687" (OuterVolumeSpecName: 
"kube-api-access-x6687") pod "6dfba30c-cf0e-4165-bc37-8284cc15b50f" (UID: "6dfba30c-cf0e-4165-bc37-8284cc15b50f"). InnerVolumeSpecName "kube-api-access-x6687". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:22:01 crc kubenswrapper[4717]: I0217 15:22:01.910307 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-inventory" (OuterVolumeSpecName: "inventory") pod "6dfba30c-cf0e-4165-bc37-8284cc15b50f" (UID: "6dfba30c-cf0e-4165-bc37-8284cc15b50f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:22:01 crc kubenswrapper[4717]: I0217 15:22:01.912373 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6dfba30c-cf0e-4165-bc37-8284cc15b50f" (UID: "6dfba30c-cf0e-4165-bc37-8284cc15b50f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:22:01 crc kubenswrapper[4717]: I0217 15:22:01.957679 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6687\" (UniqueName: \"kubernetes.io/projected/6dfba30c-cf0e-4165-bc37-8284cc15b50f-kube-api-access-x6687\") on node \"crc\" DevicePath \"\"" Feb 17 15:22:01 crc kubenswrapper[4717]: I0217 15:22:01.957711 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:22:01 crc kubenswrapper[4717]: I0217 15:22:01.957722 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dfba30c-cf0e-4165-bc37-8284cc15b50f-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.254767 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" event={"ID":"6dfba30c-cf0e-4165-bc37-8284cc15b50f","Type":"ContainerDied","Data":"fe017ca5cb1c81623af98c6af0b49157f56595d69f796882901acdb81ddb1720"} Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.254825 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe017ca5cb1c81623af98c6af0b49157f56595d69f796882901acdb81ddb1720" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.254855 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-frrhf" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.370764 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p"] Feb 17 15:22:02 crc kubenswrapper[4717]: E0217 15:22:02.371205 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfba30c-cf0e-4165-bc37-8284cc15b50f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.371226 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfba30c-cf0e-4165-bc37-8284cc15b50f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.371462 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfba30c-cf0e-4165-bc37-8284cc15b50f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.372167 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.375793 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.376380 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.376525 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.376605 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.383981 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p"] Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.572497 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs6w8\" (UniqueName: \"kubernetes.io/projected/7b728f97-9d73-433e-a910-13591303221e-kube-api-access-cs6w8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mgr9p\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.572575 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mgr9p\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 
15:22:02.572674 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mgr9p\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.674158 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs6w8\" (UniqueName: \"kubernetes.io/projected/7b728f97-9d73-433e-a910-13591303221e-kube-api-access-cs6w8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mgr9p\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.674239 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mgr9p\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.674329 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mgr9p\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.680938 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mgr9p\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.681679 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mgr9p\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.693506 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs6w8\" (UniqueName: \"kubernetes.io/projected/7b728f97-9d73-433e-a910-13591303221e-kube-api-access-cs6w8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mgr9p\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:02 crc kubenswrapper[4717]: I0217 15:22:02.700242 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:03 crc kubenswrapper[4717]: I0217 15:22:03.063942 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p"] Feb 17 15:22:03 crc kubenswrapper[4717]: I0217 15:22:03.280372 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" event={"ID":"7b728f97-9d73-433e-a910-13591303221e","Type":"ContainerStarted","Data":"e7905966db214cb89bd7b22b73778c5a16813efebb3ffb9d4b5aadd9f22e3fcb"} Feb 17 15:22:04 crc kubenswrapper[4717]: I0217 15:22:04.297672 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" event={"ID":"7b728f97-9d73-433e-a910-13591303221e","Type":"ContainerStarted","Data":"67e2dc531d017a2a6aa8acea8cc0533f72fe48f75bc962e07b082218f74c0f4f"} Feb 17 15:22:04 crc kubenswrapper[4717]: I0217 15:22:04.320377 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" podStartSLOduration=1.910714983 podStartE2EDuration="2.320345848s" podCreationTimestamp="2026-02-17 15:22:02 +0000 UTC" firstStartedPulling="2026-02-17 15:22:03.069984951 +0000 UTC m=+1789.485825437" lastFinishedPulling="2026-02-17 15:22:03.479615816 +0000 UTC m=+1789.895456302" observedRunningTime="2026-02-17 15:22:04.320032479 +0000 UTC m=+1790.735873045" watchObservedRunningTime="2026-02-17 15:22:04.320345848 +0000 UTC m=+1790.736186364" Feb 17 15:22:07 crc kubenswrapper[4717]: I0217 15:22:07.051335 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fg4vl"] Feb 17 15:22:07 crc kubenswrapper[4717]: I0217 15:22:07.060940 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fg4vl"] Feb 17 15:22:07 crc kubenswrapper[4717]: I0217 15:22:07.866685 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7055a012-2f5d-4ba2-b56d-a9ec73e11944" path="/var/lib/kubelet/pods/7055a012-2f5d-4ba2-b56d-a9ec73e11944/volumes" Feb 17 15:22:08 crc kubenswrapper[4717]: I0217 15:22:08.045124 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gxq6g"] Feb 17 15:22:08 crc kubenswrapper[4717]: I0217 15:22:08.056171 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-6ncb2"] Feb 17 15:22:08 crc kubenswrapper[4717]: I0217 15:22:08.065969 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-6ncb2"] Feb 17 15:22:08 crc kubenswrapper[4717]: I0217 15:22:08.074815 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gxq6g"] Feb 17 15:22:09 crc kubenswrapper[4717]: I0217 15:22:09.863935 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de3759e2-5565-40bd-8b02-f3c1f0d55863" path="/var/lib/kubelet/pods/de3759e2-5565-40bd-8b02-f3c1f0d55863/volumes" Feb 17 15:22:09 crc kubenswrapper[4717]: I0217 15:22:09.866321 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8aa40e-4ead-46b8-a94b-0a3602d030ef" path="/var/lib/kubelet/pods/de8aa40e-4ead-46b8-a94b-0a3602d030ef/volumes" Feb 17 15:22:10 crc kubenswrapper[4717]: I0217 15:22:10.847511 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:22:10 crc kubenswrapper[4717]: E0217 15:22:10.847950 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 
15:22:23 crc kubenswrapper[4717]: I0217 15:22:23.851479 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:22:23 crc kubenswrapper[4717]: E0217 15:22:23.852737 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:22:24 crc kubenswrapper[4717]: I0217 15:22:24.041777 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-s5lzg"] Feb 17 15:22:24 crc kubenswrapper[4717]: I0217 15:22:24.061984 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-s5lzg"] Feb 17 15:22:25 crc kubenswrapper[4717]: I0217 15:22:25.892512 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9c9596c-6124-44d0-b06b-a99477938b79" path="/var/lib/kubelet/pods/b9c9596c-6124-44d0-b06b-a99477938b79/volumes" Feb 17 15:22:38 crc kubenswrapper[4717]: I0217 15:22:38.846298 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:22:38 crc kubenswrapper[4717]: E0217 15:22:38.846986 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:22:40 crc kubenswrapper[4717]: I0217 15:22:40.734880 4717 generic.go:334] "Generic (PLEG): 
container finished" podID="7b728f97-9d73-433e-a910-13591303221e" containerID="67e2dc531d017a2a6aa8acea8cc0533f72fe48f75bc962e07b082218f74c0f4f" exitCode=0 Feb 17 15:22:40 crc kubenswrapper[4717]: I0217 15:22:40.734983 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" event={"ID":"7b728f97-9d73-433e-a910-13591303221e","Type":"ContainerDied","Data":"67e2dc531d017a2a6aa8acea8cc0533f72fe48f75bc962e07b082218f74c0f4f"} Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.260022 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.334101 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-ssh-key-openstack-edpm-ipam\") pod \"7b728f97-9d73-433e-a910-13591303221e\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.334316 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-inventory\") pod \"7b728f97-9d73-433e-a910-13591303221e\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.334380 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs6w8\" (UniqueName: \"kubernetes.io/projected/7b728f97-9d73-433e-a910-13591303221e-kube-api-access-cs6w8\") pod \"7b728f97-9d73-433e-a910-13591303221e\" (UID: \"7b728f97-9d73-433e-a910-13591303221e\") " Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.340630 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7b728f97-9d73-433e-a910-13591303221e-kube-api-access-cs6w8" (OuterVolumeSpecName: "kube-api-access-cs6w8") pod "7b728f97-9d73-433e-a910-13591303221e" (UID: "7b728f97-9d73-433e-a910-13591303221e"). InnerVolumeSpecName "kube-api-access-cs6w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.361439 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7b728f97-9d73-433e-a910-13591303221e" (UID: "7b728f97-9d73-433e-a910-13591303221e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.372741 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-inventory" (OuterVolumeSpecName: "inventory") pod "7b728f97-9d73-433e-a910-13591303221e" (UID: "7b728f97-9d73-433e-a910-13591303221e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.436778 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.436817 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b728f97-9d73-433e-a910-13591303221e-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.436832 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs6w8\" (UniqueName: \"kubernetes.io/projected/7b728f97-9d73-433e-a910-13591303221e-kube-api-access-cs6w8\") on node \"crc\" DevicePath \"\"" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.760149 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" event={"ID":"7b728f97-9d73-433e-a910-13591303221e","Type":"ContainerDied","Data":"e7905966db214cb89bd7b22b73778c5a16813efebb3ffb9d4b5aadd9f22e3fcb"} Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.760337 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7905966db214cb89bd7b22b73778c5a16813efebb3ffb9d4b5aadd9f22e3fcb" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.760200 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mgr9p" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.884290 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4"] Feb 17 15:22:42 crc kubenswrapper[4717]: E0217 15:22:42.884653 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b728f97-9d73-433e-a910-13591303221e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.884671 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b728f97-9d73-433e-a910-13591303221e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.884831 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b728f97-9d73-433e-a910-13591303221e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.885427 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.887444 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.888415 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.888435 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.895593 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.896602 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4"] Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.951427 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.951597 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:22:42 crc kubenswrapper[4717]: I0217 15:22:42.951798 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc7df\" (UniqueName: \"kubernetes.io/projected/d0c4edef-fee0-490a-8c25-9e4c9950c04f-kube-api-access-gc7df\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:22:43 crc kubenswrapper[4717]: I0217 15:22:43.054070 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc7df\" (UniqueName: \"kubernetes.io/projected/d0c4edef-fee0-490a-8c25-9e4c9950c04f-kube-api-access-gc7df\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:22:43 crc kubenswrapper[4717]: I0217 15:22:43.054185 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:22:43 crc kubenswrapper[4717]: I0217 15:22:43.054283 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:22:43 crc kubenswrapper[4717]: I0217 15:22:43.057877 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:22:43 crc kubenswrapper[4717]: I0217 15:22:43.058202 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:22:43 crc kubenswrapper[4717]: I0217 15:22:43.070275 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc7df\" (UniqueName: \"kubernetes.io/projected/d0c4edef-fee0-490a-8c25-9e4c9950c04f-kube-api-access-gc7df\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:22:43 crc kubenswrapper[4717]: I0217 15:22:43.257040 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:22:43 crc kubenswrapper[4717]: I0217 15:22:43.637272 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4"] Feb 17 15:22:43 crc kubenswrapper[4717]: W0217 15:22:43.642976 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0c4edef_fee0_490a_8c25_9e4c9950c04f.slice/crio-f58c9af29db59fe84f7cc2292e8db9f2e247224231a569f555b0bd08d4747e3c WatchSource:0}: Error finding container f58c9af29db59fe84f7cc2292e8db9f2e247224231a569f555b0bd08d4747e3c: Status 404 returned error can't find the container with id f58c9af29db59fe84f7cc2292e8db9f2e247224231a569f555b0bd08d4747e3c Feb 17 15:22:43 crc kubenswrapper[4717]: I0217 15:22:43.772653 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" event={"ID":"d0c4edef-fee0-490a-8c25-9e4c9950c04f","Type":"ContainerStarted","Data":"f58c9af29db59fe84f7cc2292e8db9f2e247224231a569f555b0bd08d4747e3c"} Feb 17 15:22:44 crc kubenswrapper[4717]: I0217 15:22:44.784582 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" event={"ID":"d0c4edef-fee0-490a-8c25-9e4c9950c04f","Type":"ContainerStarted","Data":"3ada1e6ae71afdc186de3f5b538929f0d45b8a2de8924ef44af6bfa9e571bfa5"} Feb 17 15:22:44 crc kubenswrapper[4717]: I0217 15:22:44.802370 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" podStartSLOduration=2.245513125 podStartE2EDuration="2.802350095s" podCreationTimestamp="2026-02-17 15:22:42 +0000 UTC" firstStartedPulling="2026-02-17 15:22:43.647202296 +0000 UTC m=+1830.063042772" lastFinishedPulling="2026-02-17 15:22:44.204039266 +0000 UTC m=+1830.619879742" 
observedRunningTime="2026-02-17 15:22:44.800632646 +0000 UTC m=+1831.216473152" watchObservedRunningTime="2026-02-17 15:22:44.802350095 +0000 UTC m=+1831.218190581" Feb 17 15:22:49 crc kubenswrapper[4717]: I0217 15:22:49.847792 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:22:49 crc kubenswrapper[4717]: E0217 15:22:49.850280 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:22:58 crc kubenswrapper[4717]: I0217 15:22:58.061890 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-bc18-account-create-update-qs9qz"] Feb 17 15:22:58 crc kubenswrapper[4717]: I0217 15:22:58.072135 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xzx8s"] Feb 17 15:22:58 crc kubenswrapper[4717]: I0217 15:22:58.082028 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9xpzr"] Feb 17 15:22:58 crc kubenswrapper[4717]: I0217 15:22:58.091733 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c075-account-create-update-d8f2k"] Feb 17 15:22:58 crc kubenswrapper[4717]: I0217 15:22:58.098953 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-bc18-account-create-update-qs9qz"] Feb 17 15:22:58 crc kubenswrapper[4717]: I0217 15:22:58.105650 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9xpzr"] Feb 17 15:22:58 crc kubenswrapper[4717]: I0217 15:22:58.112113 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-db-create-6nlkw"] Feb 17 15:22:58 crc kubenswrapper[4717]: I0217 15:22:58.118659 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xzx8s"] Feb 17 15:22:58 crc kubenswrapper[4717]: I0217 15:22:58.125827 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c075-account-create-update-d8f2k"] Feb 17 15:22:58 crc kubenswrapper[4717]: I0217 15:22:58.132369 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-6nlkw"] Feb 17 15:22:59 crc kubenswrapper[4717]: I0217 15:22:59.042939 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7260-account-create-update-wtt26"] Feb 17 15:22:59 crc kubenswrapper[4717]: I0217 15:22:59.061583 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7260-account-create-update-wtt26"] Feb 17 15:22:59 crc kubenswrapper[4717]: I0217 15:22:59.856944 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="062a7610-89de-4af4-a35d-b965eff08320" path="/var/lib/kubelet/pods/062a7610-89de-4af4-a35d-b965eff08320/volumes" Feb 17 15:22:59 crc kubenswrapper[4717]: I0217 15:22:59.857912 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47393b1e-7c8f-491e-a25a-0ef15e7eef3d" path="/var/lib/kubelet/pods/47393b1e-7c8f-491e-a25a-0ef15e7eef3d/volumes" Feb 17 15:22:59 crc kubenswrapper[4717]: I0217 15:22:59.858775 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd76b94-c071-4cae-8a3b-f1ef5a3ed700" path="/var/lib/kubelet/pods/4dd76b94-c071-4cae-8a3b-f1ef5a3ed700/volumes" Feb 17 15:22:59 crc kubenswrapper[4717]: I0217 15:22:59.859424 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f67dea-44d1-487d-9981-d758db840579" path="/var/lib/kubelet/pods/70f67dea-44d1-487d-9981-d758db840579/volumes" Feb 17 15:22:59 crc kubenswrapper[4717]: I0217 15:22:59.859969 4717 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="7970d67a-9fb0-493c-a143-bc2fee1d4c08" path="/var/lib/kubelet/pods/7970d67a-9fb0-493c-a143-bc2fee1d4c08/volumes" Feb 17 15:22:59 crc kubenswrapper[4717]: I0217 15:22:59.860568 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02c1955-1938-4cac-b38e-e1eec7332813" path="/var/lib/kubelet/pods/d02c1955-1938-4cac-b38e-e1eec7332813/volumes" Feb 17 15:23:00 crc kubenswrapper[4717]: I0217 15:23:00.589791 4717 scope.go:117] "RemoveContainer" containerID="b7ce7642ceea57a15c95230a894cb82d9521e86a44a2c27a5c55ef79a0b1a0dc" Feb 17 15:23:00 crc kubenswrapper[4717]: I0217 15:23:00.640778 4717 scope.go:117] "RemoveContainer" containerID="f33c50edd36ecabb200538b62fe7adc36e1722f79d2e1b875b1b36359e991b96" Feb 17 15:23:00 crc kubenswrapper[4717]: I0217 15:23:00.697521 4717 scope.go:117] "RemoveContainer" containerID="ebb33749d4e4dad6fa211bddb69de895fd6053aac3712d7fc8155a6bf83ada93" Feb 17 15:23:00 crc kubenswrapper[4717]: I0217 15:23:00.779061 4717 scope.go:117] "RemoveContainer" containerID="23b0af1a8bdcb765fc916a7eeaabd1df7b569556c9185090f4dbdfde1ed35f1d" Feb 17 15:23:00 crc kubenswrapper[4717]: I0217 15:23:00.813853 4717 scope.go:117] "RemoveContainer" containerID="649701b8f1b2cab221feb3d8ed2d837aee0b938775799a8853384119a3b8a680" Feb 17 15:23:00 crc kubenswrapper[4717]: I0217 15:23:00.861836 4717 scope.go:117] "RemoveContainer" containerID="967062522ba9d55030f87bbfbcaf4f9488d1ddd65c5f0052dcc040e5f828b911" Feb 17 15:23:00 crc kubenswrapper[4717]: I0217 15:23:00.909426 4717 scope.go:117] "RemoveContainer" containerID="8f78b1d79b5cc9a34466bc0d944fec17c7c33792639de76621b324984fd047b1" Feb 17 15:23:00 crc kubenswrapper[4717]: I0217 15:23:00.926169 4717 scope.go:117] "RemoveContainer" containerID="3434890440c8c96db0a0ff40d92887735467f5ec9d22d7e6d1dcdda7223f0922" Feb 17 15:23:00 crc kubenswrapper[4717]: I0217 15:23:00.945913 4717 scope.go:117] "RemoveContainer" 
containerID="64148e567539ee53dc679886de02850eb725d2b861ae53a9715e8804030baf57" Feb 17 15:23:00 crc kubenswrapper[4717]: I0217 15:23:00.993098 4717 scope.go:117] "RemoveContainer" containerID="0b48c1cea0ca4bad7a37163fe233403ebe72d23f69a6d0a7a5a08762638b0dd0" Feb 17 15:23:04 crc kubenswrapper[4717]: I0217 15:23:04.847219 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:23:04 crc kubenswrapper[4717]: E0217 15:23:04.848195 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:23:15 crc kubenswrapper[4717]: I0217 15:23:15.852977 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:23:15 crc kubenswrapper[4717]: E0217 15:23:15.853871 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:23:27 crc kubenswrapper[4717]: I0217 15:23:27.846982 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:23:27 crc kubenswrapper[4717]: E0217 15:23:27.848033 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:23:30 crc kubenswrapper[4717]: I0217 15:23:30.052374 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-24w4v"] Feb 17 15:23:30 crc kubenswrapper[4717]: I0217 15:23:30.064994 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-24w4v"] Feb 17 15:23:31 crc kubenswrapper[4717]: I0217 15:23:31.860426 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2692b981-8aba-4b0e-b25c-d53a5846e272" path="/var/lib/kubelet/pods/2692b981-8aba-4b0e-b25c-d53a5846e272/volumes" Feb 17 15:23:34 crc kubenswrapper[4717]: I0217 15:23:34.358215 4717 generic.go:334] "Generic (PLEG): container finished" podID="d0c4edef-fee0-490a-8c25-9e4c9950c04f" containerID="3ada1e6ae71afdc186de3f5b538929f0d45b8a2de8924ef44af6bfa9e571bfa5" exitCode=0 Feb 17 15:23:34 crc kubenswrapper[4717]: I0217 15:23:34.358292 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" event={"ID":"d0c4edef-fee0-490a-8c25-9e4c9950c04f","Type":"ContainerDied","Data":"3ada1e6ae71afdc186de3f5b538929f0d45b8a2de8924ef44af6bfa9e571bfa5"} Feb 17 15:23:35 crc kubenswrapper[4717]: I0217 15:23:35.793864 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:23:35 crc kubenswrapper[4717]: I0217 15:23:35.983824 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc7df\" (UniqueName: \"kubernetes.io/projected/d0c4edef-fee0-490a-8c25-9e4c9950c04f-kube-api-access-gc7df\") pod \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " Feb 17 15:23:35 crc kubenswrapper[4717]: I0217 15:23:35.984026 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-ssh-key-openstack-edpm-ipam\") pod \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " Feb 17 15:23:35 crc kubenswrapper[4717]: I0217 15:23:35.984223 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-inventory\") pod \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\" (UID: \"d0c4edef-fee0-490a-8c25-9e4c9950c04f\") " Feb 17 15:23:35 crc kubenswrapper[4717]: I0217 15:23:35.997456 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c4edef-fee0-490a-8c25-9e4c9950c04f-kube-api-access-gc7df" (OuterVolumeSpecName: "kube-api-access-gc7df") pod "d0c4edef-fee0-490a-8c25-9e4c9950c04f" (UID: "d0c4edef-fee0-490a-8c25-9e4c9950c04f"). InnerVolumeSpecName "kube-api-access-gc7df". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.009038 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-inventory" (OuterVolumeSpecName: "inventory") pod "d0c4edef-fee0-490a-8c25-9e4c9950c04f" (UID: "d0c4edef-fee0-490a-8c25-9e4c9950c04f"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.032083 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d0c4edef-fee0-490a-8c25-9e4c9950c04f" (UID: "d0c4edef-fee0-490a-8c25-9e4c9950c04f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.097908 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc7df\" (UniqueName: \"kubernetes.io/projected/d0c4edef-fee0-490a-8c25-9e4c9950c04f-kube-api-access-gc7df\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.097959 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.097978 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0c4edef-fee0-490a-8c25-9e4c9950c04f-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.388365 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" event={"ID":"d0c4edef-fee0-490a-8c25-9e4c9950c04f","Type":"ContainerDied","Data":"f58c9af29db59fe84f7cc2292e8db9f2e247224231a569f555b0bd08d4747e3c"} Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.388427 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f58c9af29db59fe84f7cc2292e8db9f2e247224231a569f555b0bd08d4747e3c" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 
15:23:36.388495 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.594763 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6rtxk"] Feb 17 15:23:36 crc kubenswrapper[4717]: E0217 15:23:36.595261 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c4edef-fee0-490a-8c25-9e4c9950c04f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.595284 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c4edef-fee0-490a-8c25-9e4c9950c04f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.595500 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c4edef-fee0-490a-8c25-9e4c9950c04f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.596265 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.599650 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.601160 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.601406 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.602291 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.618249 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6rtxk"] Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.707617 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6rtxk\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.707730 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6m29\" (UniqueName: \"kubernetes.io/projected/205b31b1-1e73-466f-9ede-0248217b4356-kube-api-access-m6m29\") pod \"ssh-known-hosts-edpm-deployment-6rtxk\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.707832 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6rtxk\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.809456 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6rtxk\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.810099 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6rtxk\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.810224 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6m29\" (UniqueName: \"kubernetes.io/projected/205b31b1-1e73-466f-9ede-0248217b4356-kube-api-access-m6m29\") pod \"ssh-known-hosts-edpm-deployment-6rtxk\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.817407 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6rtxk\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 
17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.823124 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6rtxk\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.831354 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6m29\" (UniqueName: \"kubernetes.io/projected/205b31b1-1e73-466f-9ede-0248217b4356-kube-api-access-m6m29\") pod \"ssh-known-hosts-edpm-deployment-6rtxk\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:36 crc kubenswrapper[4717]: I0217 15:23:36.925984 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:37 crc kubenswrapper[4717]: I0217 15:23:37.453795 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6rtxk"] Feb 17 15:23:38 crc kubenswrapper[4717]: I0217 15:23:38.415881 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" event={"ID":"205b31b1-1e73-466f-9ede-0248217b4356","Type":"ContainerStarted","Data":"889f753417fac982c2e3cf622430ab5310400cd479b4b32419b34ef03484e660"} Feb 17 15:23:38 crc kubenswrapper[4717]: I0217 15:23:38.416952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" event={"ID":"205b31b1-1e73-466f-9ede-0248217b4356","Type":"ContainerStarted","Data":"9cd9bbccae8e558f467d15f8f878582fc3ef88dab63e691f5567fdc9f78c4051"} Feb 17 15:23:38 crc kubenswrapper[4717]: I0217 15:23:38.449926 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" podStartSLOduration=1.926333351 podStartE2EDuration="2.44990016s" podCreationTimestamp="2026-02-17 15:23:36 +0000 UTC" firstStartedPulling="2026-02-17 15:23:37.462483776 +0000 UTC m=+1883.878324252" lastFinishedPulling="2026-02-17 15:23:37.986050555 +0000 UTC m=+1884.401891061" observedRunningTime="2026-02-17 15:23:38.440750251 +0000 UTC m=+1884.856590757" watchObservedRunningTime="2026-02-17 15:23:38.44990016 +0000 UTC m=+1884.865740646" Feb 17 15:23:41 crc kubenswrapper[4717]: I0217 15:23:41.846774 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:23:41 crc kubenswrapper[4717]: E0217 15:23:41.847604 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:23:45 crc kubenswrapper[4717]: I0217 15:23:45.485072 4717 generic.go:334] "Generic (PLEG): container finished" podID="205b31b1-1e73-466f-9ede-0248217b4356" containerID="889f753417fac982c2e3cf622430ab5310400cd479b4b32419b34ef03484e660" exitCode=0 Feb 17 15:23:45 crc kubenswrapper[4717]: I0217 15:23:45.485176 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" event={"ID":"205b31b1-1e73-466f-9ede-0248217b4356","Type":"ContainerDied","Data":"889f753417fac982c2e3cf622430ab5310400cd479b4b32419b34ef03484e660"} Feb 17 15:23:46 crc kubenswrapper[4717]: I0217 15:23:46.899922 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:46 crc kubenswrapper[4717]: I0217 15:23:46.985717 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-inventory-0\") pod \"205b31b1-1e73-466f-9ede-0248217b4356\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " Feb 17 15:23:46 crc kubenswrapper[4717]: I0217 15:23:46.985839 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6m29\" (UniqueName: \"kubernetes.io/projected/205b31b1-1e73-466f-9ede-0248217b4356-kube-api-access-m6m29\") pod \"205b31b1-1e73-466f-9ede-0248217b4356\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " Feb 17 15:23:46 crc kubenswrapper[4717]: I0217 15:23:46.985890 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-ssh-key-openstack-edpm-ipam\") pod \"205b31b1-1e73-466f-9ede-0248217b4356\" (UID: \"205b31b1-1e73-466f-9ede-0248217b4356\") " Feb 17 15:23:46 crc kubenswrapper[4717]: I0217 15:23:46.992180 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/205b31b1-1e73-466f-9ede-0248217b4356-kube-api-access-m6m29" (OuterVolumeSpecName: "kube-api-access-m6m29") pod "205b31b1-1e73-466f-9ede-0248217b4356" (UID: "205b31b1-1e73-466f-9ede-0248217b4356"). InnerVolumeSpecName "kube-api-access-m6m29". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.018840 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "205b31b1-1e73-466f-9ede-0248217b4356" (UID: "205b31b1-1e73-466f-9ede-0248217b4356"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.026556 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "205b31b1-1e73-466f-9ede-0248217b4356" (UID: "205b31b1-1e73-466f-9ede-0248217b4356"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.088171 4717 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.088200 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6m29\" (UniqueName: \"kubernetes.io/projected/205b31b1-1e73-466f-9ede-0248217b4356-kube-api-access-m6m29\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.088211 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/205b31b1-1e73-466f-9ede-0248217b4356-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.503110 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" event={"ID":"205b31b1-1e73-466f-9ede-0248217b4356","Type":"ContainerDied","Data":"9cd9bbccae8e558f467d15f8f878582fc3ef88dab63e691f5567fdc9f78c4051"} Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.503155 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cd9bbccae8e558f467d15f8f878582fc3ef88dab63e691f5567fdc9f78c4051" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.503195 
4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6rtxk" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.594378 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h"] Feb 17 15:23:47 crc kubenswrapper[4717]: E0217 15:23:47.594732 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205b31b1-1e73-466f-9ede-0248217b4356" containerName="ssh-known-hosts-edpm-deployment" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.594748 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="205b31b1-1e73-466f-9ede-0248217b4356" containerName="ssh-known-hosts-edpm-deployment" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.594901 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="205b31b1-1e73-466f-9ede-0248217b4356" containerName="ssh-known-hosts-edpm-deployment" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.595505 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.599111 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.599282 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.600350 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.601794 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.612124 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h"] Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.697178 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dbw5h\" (UID: \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.697300 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dbw5h\" (UID: \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.697324 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m65gz\" (UniqueName: \"kubernetes.io/projected/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-kube-api-access-m65gz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dbw5h\" (UID: \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.798873 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dbw5h\" (UID: \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.799293 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m65gz\" (UniqueName: \"kubernetes.io/projected/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-kube-api-access-m65gz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dbw5h\" (UID: \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.799447 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dbw5h\" (UID: \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.805469 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dbw5h\" (UID: 
\"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.808662 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dbw5h\" (UID: \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.817203 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m65gz\" (UniqueName: \"kubernetes.io/projected/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-kube-api-access-m65gz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dbw5h\" (UID: \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:47 crc kubenswrapper[4717]: I0217 15:23:47.913614 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:48 crc kubenswrapper[4717]: I0217 15:23:48.503312 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h"] Feb 17 15:23:49 crc kubenswrapper[4717]: I0217 15:23:49.050416 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-4q8gf"] Feb 17 15:23:49 crc kubenswrapper[4717]: I0217 15:23:49.062066 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4q8gf"] Feb 17 15:23:49 crc kubenswrapper[4717]: I0217 15:23:49.521330 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" event={"ID":"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a","Type":"ContainerStarted","Data":"8b2fa08a01e6f430e2c528e6bea8d738461fe85f0fcce6b722bd53976d48ac2b"} Feb 17 15:23:49 crc kubenswrapper[4717]: I0217 15:23:49.521372 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" event={"ID":"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a","Type":"ContainerStarted","Data":"4c9362aa24b46c52b9162d1d4ce3d804400af537536ec011a05cca8c9fdd6517"} Feb 17 15:23:49 crc kubenswrapper[4717]: I0217 15:23:49.555636 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" podStartSLOduration=2.080349038 podStartE2EDuration="2.555615757s" podCreationTimestamp="2026-02-17 15:23:47 +0000 UTC" firstStartedPulling="2026-02-17 15:23:48.513793039 +0000 UTC m=+1894.929633525" lastFinishedPulling="2026-02-17 15:23:48.989059738 +0000 UTC m=+1895.404900244" observedRunningTime="2026-02-17 15:23:49.541310811 +0000 UTC m=+1895.957151317" watchObservedRunningTime="2026-02-17 15:23:49.555615757 +0000 UTC m=+1895.971456233" Feb 17 15:23:49 crc kubenswrapper[4717]: I0217 15:23:49.870699 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ed346a-8b4f-464f-8035-73f75ad5e83f" path="/var/lib/kubelet/pods/d9ed346a-8b4f-464f-8035-73f75ad5e83f/volumes" Feb 17 15:23:50 crc kubenswrapper[4717]: I0217 15:23:50.051404 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-68ckm"] Feb 17 15:23:50 crc kubenswrapper[4717]: I0217 15:23:50.082432 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-68ckm"] Feb 17 15:23:51 crc kubenswrapper[4717]: I0217 15:23:51.865191 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55362b6c-9d2e-4df5-887d-3955d617c166" path="/var/lib/kubelet/pods/55362b6c-9d2e-4df5-887d-3955d617c166/volumes" Feb 17 15:23:56 crc kubenswrapper[4717]: I0217 15:23:56.847054 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:23:56 crc kubenswrapper[4717]: E0217 15:23:56.847624 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:23:57 crc kubenswrapper[4717]: I0217 15:23:57.618821 4717 generic.go:334] "Generic (PLEG): container finished" podID="c60befe9-ac76-4ba1-9cd0-15154e3c4e7a" containerID="8b2fa08a01e6f430e2c528e6bea8d738461fe85f0fcce6b722bd53976d48ac2b" exitCode=0 Feb 17 15:23:57 crc kubenswrapper[4717]: I0217 15:23:57.618939 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" 
event={"ID":"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a","Type":"ContainerDied","Data":"8b2fa08a01e6f430e2c528e6bea8d738461fe85f0fcce6b722bd53976d48ac2b"} Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.179783 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.361298 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m65gz\" (UniqueName: \"kubernetes.io/projected/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-kube-api-access-m65gz\") pod \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\" (UID: \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.361352 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-ssh-key-openstack-edpm-ipam\") pod \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\" (UID: \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.361389 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-inventory\") pod \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\" (UID: \"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a\") " Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.367010 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-kube-api-access-m65gz" (OuterVolumeSpecName: "kube-api-access-m65gz") pod "c60befe9-ac76-4ba1-9cd0-15154e3c4e7a" (UID: "c60befe9-ac76-4ba1-9cd0-15154e3c4e7a"). InnerVolumeSpecName "kube-api-access-m65gz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.397905 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c60befe9-ac76-4ba1-9cd0-15154e3c4e7a" (UID: "c60befe9-ac76-4ba1-9cd0-15154e3c4e7a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.418055 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-inventory" (OuterVolumeSpecName: "inventory") pod "c60befe9-ac76-4ba1-9cd0-15154e3c4e7a" (UID: "c60befe9-ac76-4ba1-9cd0-15154e3c4e7a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.463512 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m65gz\" (UniqueName: \"kubernetes.io/projected/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-kube-api-access-m65gz\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.463557 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.463577 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c60befe9-ac76-4ba1-9cd0-15154e3c4e7a-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.640608 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" 
event={"ID":"c60befe9-ac76-4ba1-9cd0-15154e3c4e7a","Type":"ContainerDied","Data":"4c9362aa24b46c52b9162d1d4ce3d804400af537536ec011a05cca8c9fdd6517"} Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.640669 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c9362aa24b46c52b9162d1d4ce3d804400af537536ec011a05cca8c9fdd6517" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.640669 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dbw5h" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.784190 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp"] Feb 17 15:23:59 crc kubenswrapper[4717]: E0217 15:23:59.784537 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c60befe9-ac76-4ba1-9cd0-15154e3c4e7a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.784549 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c60befe9-ac76-4ba1-9cd0-15154e3c4e7a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.784739 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c60befe9-ac76-4ba1-9cd0-15154e3c4e7a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.785394 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.788462 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.790032 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.798486 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.798621 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.806290 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp"] Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.871497 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.871560 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.871731 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wnp9\" (UniqueName: \"kubernetes.io/projected/dcbb0902-ee67-4df1-b420-f299e4400354-kube-api-access-7wnp9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.973231 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.973307 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.973356 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wnp9\" (UniqueName: \"kubernetes.io/projected/dcbb0902-ee67-4df1-b420-f299e4400354-kube-api-access-7wnp9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.976565 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.977431 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:23:59 crc kubenswrapper[4717]: I0217 15:23:59.993896 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wnp9\" (UniqueName: \"kubernetes.io/projected/dcbb0902-ee67-4df1-b420-f299e4400354-kube-api-access-7wnp9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:24:00 crc kubenswrapper[4717]: I0217 15:24:00.166558 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:24:00 crc kubenswrapper[4717]: I0217 15:24:00.699974 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp"] Feb 17 15:24:01 crc kubenswrapper[4717]: I0217 15:24:01.240193 4717 scope.go:117] "RemoveContainer" containerID="ef9639468077aac0cb62b975f12d75e6bc1ad3fa6c8c6e7a33b6a043a4a20a09" Feb 17 15:24:01 crc kubenswrapper[4717]: I0217 15:24:01.315678 4717 scope.go:117] "RemoveContainer" containerID="8448081ee7799d961d0b56d916c4f02569f02a25b1507c1aacfd35e596dee9d6" Feb 17 15:24:01 crc kubenswrapper[4717]: I0217 15:24:01.412290 4717 scope.go:117] "RemoveContainer" containerID="5f3c66ec3024fd89a219ff95045d9d9435a2abf6bfbc11ee51d5732acf582ac4" Feb 17 15:24:01 crc kubenswrapper[4717]: I0217 15:24:01.665563 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" event={"ID":"dcbb0902-ee67-4df1-b420-f299e4400354","Type":"ContainerStarted","Data":"80dc0c0c4bea911fe9cd5c9d29f0d44e01b81e451a1c6304cb5e62f189dfceda"} Feb 17 15:24:01 crc kubenswrapper[4717]: I0217 15:24:01.665643 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" event={"ID":"dcbb0902-ee67-4df1-b420-f299e4400354","Type":"ContainerStarted","Data":"ddf2d526bfae9ce5cf67458cb1e26c652e57a232de17dfb770eda29a660c81b2"} Feb 17 15:24:01 crc kubenswrapper[4717]: I0217 15:24:01.684559 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" podStartSLOduration=2.245003178 podStartE2EDuration="2.684541784s" podCreationTimestamp="2026-02-17 15:23:59 +0000 UTC" firstStartedPulling="2026-02-17 15:24:00.709406797 +0000 UTC m=+1907.125247283" lastFinishedPulling="2026-02-17 15:24:01.148945423 +0000 UTC m=+1907.564785889" 
observedRunningTime="2026-02-17 15:24:01.683386151 +0000 UTC m=+1908.099226657" watchObservedRunningTime="2026-02-17 15:24:01.684541784 +0000 UTC m=+1908.100382260" Feb 17 15:24:07 crc kubenswrapper[4717]: I0217 15:24:07.847862 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:24:07 crc kubenswrapper[4717]: E0217 15:24:07.848951 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:24:10 crc kubenswrapper[4717]: I0217 15:24:10.795338 4717 generic.go:334] "Generic (PLEG): container finished" podID="dcbb0902-ee67-4df1-b420-f299e4400354" containerID="80dc0c0c4bea911fe9cd5c9d29f0d44e01b81e451a1c6304cb5e62f189dfceda" exitCode=0 Feb 17 15:24:10 crc kubenswrapper[4717]: I0217 15:24:10.795786 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" event={"ID":"dcbb0902-ee67-4df1-b420-f299e4400354","Type":"ContainerDied","Data":"80dc0c0c4bea911fe9cd5c9d29f0d44e01b81e451a1c6304cb5e62f189dfceda"} Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.352610 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.422638 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wnp9\" (UniqueName: \"kubernetes.io/projected/dcbb0902-ee67-4df1-b420-f299e4400354-kube-api-access-7wnp9\") pod \"dcbb0902-ee67-4df1-b420-f299e4400354\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.422710 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-ssh-key-openstack-edpm-ipam\") pod \"dcbb0902-ee67-4df1-b420-f299e4400354\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.422840 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-inventory\") pod \"dcbb0902-ee67-4df1-b420-f299e4400354\" (UID: \"dcbb0902-ee67-4df1-b420-f299e4400354\") " Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.432883 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcbb0902-ee67-4df1-b420-f299e4400354-kube-api-access-7wnp9" (OuterVolumeSpecName: "kube-api-access-7wnp9") pod "dcbb0902-ee67-4df1-b420-f299e4400354" (UID: "dcbb0902-ee67-4df1-b420-f299e4400354"). InnerVolumeSpecName "kube-api-access-7wnp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.465155 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-inventory" (OuterVolumeSpecName: "inventory") pod "dcbb0902-ee67-4df1-b420-f299e4400354" (UID: "dcbb0902-ee67-4df1-b420-f299e4400354"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.476225 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dcbb0902-ee67-4df1-b420-f299e4400354" (UID: "dcbb0902-ee67-4df1-b420-f299e4400354"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.525463 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.525497 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wnp9\" (UniqueName: \"kubernetes.io/projected/dcbb0902-ee67-4df1-b420-f299e4400354-kube-api-access-7wnp9\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.525509 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dcbb0902-ee67-4df1-b420-f299e4400354-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.819405 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" event={"ID":"dcbb0902-ee67-4df1-b420-f299e4400354","Type":"ContainerDied","Data":"ddf2d526bfae9ce5cf67458cb1e26c652e57a232de17dfb770eda29a660c81b2"} Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.819444 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddf2d526bfae9ce5cf67458cb1e26c652e57a232de17dfb770eda29a660c81b2" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 
15:24:12.819477 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.931901 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk"] Feb 17 15:24:12 crc kubenswrapper[4717]: E0217 15:24:12.932624 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcbb0902-ee67-4df1-b420-f299e4400354" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.932730 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcbb0902-ee67-4df1-b420-f299e4400354" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.933121 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcbb0902-ee67-4df1-b420-f299e4400354" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.933976 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.937360 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.937452 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.937363 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.937856 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.937986 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.938031 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.938269 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.942568 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:24:12 crc kubenswrapper[4717]: I0217 15:24:12.962807 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk"] Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035380 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035425 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035458 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035488 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035524 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035569 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035600 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035656 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035682 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035718 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035738 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035782 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035803 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8jwj\" (UniqueName: 
\"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-kube-api-access-p8jwj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.035821 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138073 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138154 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138193 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ssh-key-openstack-edpm-ipam\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138242 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138289 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138330 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138401 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138499 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138540 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138564 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138587 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138624 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138657 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8jwj\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-kube-api-access-p8jwj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.138693 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.144496 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc 
kubenswrapper[4717]: I0217 15:24:13.154671 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.155323 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.155516 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.155903 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.156969 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.157000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.160586 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.160689 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.173729 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ssh-key-openstack-edpm-ipam\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.173867 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.175822 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.180599 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.180788 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8jwj\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-kube-api-access-p8jwj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.259673 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.801690 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk"] Feb 17 15:24:13 crc kubenswrapper[4717]: W0217 15:24:13.803046 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d553699_af50_41f7_b4da_1e0182788f60.slice/crio-41ac82e4e8331dc9c682ba20f6d2875b53efa0cbf71d376db58bebec5ab9eed1 WatchSource:0}: Error finding container 41ac82e4e8331dc9c682ba20f6d2875b53efa0cbf71d376db58bebec5ab9eed1: Status 404 returned error can't find the container with id 41ac82e4e8331dc9c682ba20f6d2875b53efa0cbf71d376db58bebec5ab9eed1 Feb 17 15:24:13 crc kubenswrapper[4717]: I0217 15:24:13.831006 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" event={"ID":"8d553699-af50-41f7-b4da-1e0182788f60","Type":"ContainerStarted","Data":"41ac82e4e8331dc9c682ba20f6d2875b53efa0cbf71d376db58bebec5ab9eed1"} Feb 17 15:24:14 crc kubenswrapper[4717]: I0217 15:24:14.839882 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" event={"ID":"8d553699-af50-41f7-b4da-1e0182788f60","Type":"ContainerStarted","Data":"783fc1f9b4f88eb0eb98b68ae508c07b0a2dd1dacc2ffd665afa2c3657d390e0"} Feb 17 15:24:14 crc kubenswrapper[4717]: I0217 15:24:14.871307 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" podStartSLOduration=2.297412466 podStartE2EDuration="2.871290642s" 
podCreationTimestamp="2026-02-17 15:24:12 +0000 UTC" firstStartedPulling="2026-02-17 15:24:13.807173732 +0000 UTC m=+1920.223014228" lastFinishedPulling="2026-02-17 15:24:14.381051898 +0000 UTC m=+1920.796892404" observedRunningTime="2026-02-17 15:24:14.863709727 +0000 UTC m=+1921.279550203" watchObservedRunningTime="2026-02-17 15:24:14.871290642 +0000 UTC m=+1921.287131118" Feb 17 15:24:19 crc kubenswrapper[4717]: I0217 15:24:19.849253 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:24:19 crc kubenswrapper[4717]: E0217 15:24:19.851726 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:24:33 crc kubenswrapper[4717]: I0217 15:24:33.067196 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-qzr8m"] Feb 17 15:24:33 crc kubenswrapper[4717]: I0217 15:24:33.078655 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-qzr8m"] Feb 17 15:24:33 crc kubenswrapper[4717]: I0217 15:24:33.862816 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b2abfd-5466-4810-b0ba-dfd4a956549b" path="/var/lib/kubelet/pods/94b2abfd-5466-4810-b0ba-dfd4a956549b/volumes" Feb 17 15:24:34 crc kubenswrapper[4717]: I0217 15:24:34.847692 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:24:34 crc kubenswrapper[4717]: E0217 15:24:34.848220 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:24:46 crc kubenswrapper[4717]: I0217 15:24:46.847206 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:24:46 crc kubenswrapper[4717]: E0217 15:24:46.848431 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:24:54 crc kubenswrapper[4717]: I0217 15:24:54.245836 4717 generic.go:334] "Generic (PLEG): container finished" podID="8d553699-af50-41f7-b4da-1e0182788f60" containerID="783fc1f9b4f88eb0eb98b68ae508c07b0a2dd1dacc2ffd665afa2c3657d390e0" exitCode=0 Feb 17 15:24:54 crc kubenswrapper[4717]: I0217 15:24:54.245919 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" event={"ID":"8d553699-af50-41f7-b4da-1e0182788f60","Type":"ContainerDied","Data":"783fc1f9b4f88eb0eb98b68ae508c07b0a2dd1dacc2ffd665afa2c3657d390e0"} Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.723419 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.882606 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-libvirt-combined-ca-bundle\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.883599 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-telemetry-combined-ca-bundle\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.883710 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-nova-combined-ca-bundle\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.883838 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.883962 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ssh-key-openstack-edpm-ipam\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: 
\"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.884040 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-bootstrap-combined-ca-bundle\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.884162 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8jwj\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-kube-api-access-p8jwj\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.884839 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-repo-setup-combined-ca-bundle\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.884981 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-inventory\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.885142 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.885237 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-ovn-default-certs-0\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.885538 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.885633 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ovn-combined-ca-bundle\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.885747 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-neutron-metadata-combined-ca-bundle\") pod \"8d553699-af50-41f7-b4da-1e0182788f60\" (UID: \"8d553699-af50-41f7-b4da-1e0182788f60\") " Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.890594 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.891016 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.891120 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.891280 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-kube-api-access-p8jwj" (OuterVolumeSpecName: "kube-api-access-p8jwj") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "kube-api-access-p8jwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.892008 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.892019 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.893024 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.893568 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.893938 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.894025 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.894367 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.894640 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.920131 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-inventory" (OuterVolumeSpecName: "inventory") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.921265 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8d553699-af50-41f7-b4da-1e0182788f60" (UID: "8d553699-af50-41f7-b4da-1e0182788f60"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989232 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989272 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989289 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989301 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989311 4717 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989321 4717 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989335 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989345 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989355 4717 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989364 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8jwj\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-kube-api-access-p8jwj\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989373 4717 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989386 4717 reconciler_common.go:293] "Volume 
detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d553699-af50-41f7-b4da-1e0182788f60-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989397 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:55 crc kubenswrapper[4717]: I0217 15:24:55.989410 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8d553699-af50-41f7-b4da-1e0182788f60-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.275597 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" event={"ID":"8d553699-af50-41f7-b4da-1e0182788f60","Type":"ContainerDied","Data":"41ac82e4e8331dc9c682ba20f6d2875b53efa0cbf71d376db58bebec5ab9eed1"} Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.275681 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41ac82e4e8331dc9c682ba20f6d2875b53efa0cbf71d376db58bebec5ab9eed1" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.275848 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.402126 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4"] Feb 17 15:24:56 crc kubenswrapper[4717]: E0217 15:24:56.402639 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d553699-af50-41f7-b4da-1e0182788f60" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.402672 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d553699-af50-41f7-b4da-1e0182788f60" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.402982 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d553699-af50-41f7-b4da-1e0182788f60" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.404572 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.406941 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.407134 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.407926 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.408196 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.408542 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.414599 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4"] Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.603316 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.603510 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: 
\"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.603777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5cxl\" (UniqueName: \"kubernetes.io/projected/f366c26c-4b32-488e-8738-dbbf0ddd3adc-kube-api-access-k5cxl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.603946 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.604037 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.706323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5cxl\" (UniqueName: \"kubernetes.io/projected/f366c26c-4b32-488e-8738-dbbf0ddd3adc-kube-api-access-k5cxl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.706391 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.706440 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.706527 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.706593 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.708165 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: 
\"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.710881 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.712123 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.712709 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.727127 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5cxl\" (UniqueName: \"kubernetes.io/projected/f366c26c-4b32-488e-8738-dbbf0ddd3adc-kube-api-access-k5cxl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mvpm4\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:56 crc kubenswrapper[4717]: I0217 15:24:56.737075 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:24:57 crc kubenswrapper[4717]: I0217 15:24:57.321803 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4"] Feb 17 15:24:57 crc kubenswrapper[4717]: I0217 15:24:57.323201 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:24:58 crc kubenswrapper[4717]: I0217 15:24:58.313129 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" event={"ID":"f366c26c-4b32-488e-8738-dbbf0ddd3adc","Type":"ContainerStarted","Data":"20d987a8291a2b34004cc6c28cca5c9a3f4735197a5b28b3ee36c69d50fd19d8"} Feb 17 15:24:58 crc kubenswrapper[4717]: I0217 15:24:58.313526 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" event={"ID":"f366c26c-4b32-488e-8738-dbbf0ddd3adc","Type":"ContainerStarted","Data":"e2d636bf7b18367fa92186790a62bb859982a23c1ea51164c977b7fbee3ae9a8"} Feb 17 15:24:58 crc kubenswrapper[4717]: I0217 15:24:58.341258 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" podStartSLOduration=1.831713181 podStartE2EDuration="2.341230272s" podCreationTimestamp="2026-02-17 15:24:56 +0000 UTC" firstStartedPulling="2026-02-17 15:24:57.322753676 +0000 UTC m=+1963.738594192" lastFinishedPulling="2026-02-17 15:24:57.832270807 +0000 UTC m=+1964.248111283" observedRunningTime="2026-02-17 15:24:58.336760685 +0000 UTC m=+1964.752601181" watchObservedRunningTime="2026-02-17 15:24:58.341230272 +0000 UTC m=+1964.757070778" Feb 17 15:25:00 crc kubenswrapper[4717]: I0217 15:25:00.846997 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:25:00 crc kubenswrapper[4717]: E0217 15:25:00.847327 4717 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:25:01 crc kubenswrapper[4717]: I0217 15:25:01.544366 4717 scope.go:117] "RemoveContainer" containerID="03d544f8840202792cf82407d5091b13d23e943807fe029aee1d4cf98070d1ab" Feb 17 15:25:14 crc kubenswrapper[4717]: I0217 15:25:14.846917 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:25:14 crc kubenswrapper[4717]: E0217 15:25:14.847712 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:25:15 crc kubenswrapper[4717]: I0217 15:25:15.770602 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cnpsk"] Feb 17 15:25:15 crc kubenswrapper[4717]: I0217 15:25:15.774224 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:15 crc kubenswrapper[4717]: I0217 15:25:15.796125 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnpsk"] Feb 17 15:25:15 crc kubenswrapper[4717]: I0217 15:25:15.870534 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-catalog-content\") pod \"community-operators-cnpsk\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:15 crc kubenswrapper[4717]: I0217 15:25:15.870598 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-utilities\") pod \"community-operators-cnpsk\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:15 crc kubenswrapper[4717]: I0217 15:25:15.870659 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dgrh\" (UniqueName: \"kubernetes.io/projected/7a89abed-472d-412d-9b9a-60fd5ada6584-kube-api-access-2dgrh\") pod \"community-operators-cnpsk\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:15 crc kubenswrapper[4717]: I0217 15:25:15.972752 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-catalog-content\") pod \"community-operators-cnpsk\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:15 crc kubenswrapper[4717]: I0217 15:25:15.973098 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-utilities\") pod \"community-operators-cnpsk\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:15 crc kubenswrapper[4717]: I0217 15:25:15.973191 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dgrh\" (UniqueName: \"kubernetes.io/projected/7a89abed-472d-412d-9b9a-60fd5ada6584-kube-api-access-2dgrh\") pod \"community-operators-cnpsk\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:15 crc kubenswrapper[4717]: I0217 15:25:15.973600 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-utilities\") pod \"community-operators-cnpsk\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:15 crc kubenswrapper[4717]: I0217 15:25:15.973634 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-catalog-content\") pod \"community-operators-cnpsk\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:16 crc kubenswrapper[4717]: I0217 15:25:15.995275 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dgrh\" (UniqueName: \"kubernetes.io/projected/7a89abed-472d-412d-9b9a-60fd5ada6584-kube-api-access-2dgrh\") pod \"community-operators-cnpsk\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:16 crc kubenswrapper[4717]: I0217 15:25:16.103421 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:16 crc kubenswrapper[4717]: I0217 15:25:16.669592 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnpsk"] Feb 17 15:25:16 crc kubenswrapper[4717]: W0217 15:25:16.679311 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a89abed_472d_412d_9b9a_60fd5ada6584.slice/crio-b68fc98f0c7497afd6e2b5fe39fd32b9b98be5d8ae5ac94c332f9bf3a6049600 WatchSource:0}: Error finding container b68fc98f0c7497afd6e2b5fe39fd32b9b98be5d8ae5ac94c332f9bf3a6049600: Status 404 returned error can't find the container with id b68fc98f0c7497afd6e2b5fe39fd32b9b98be5d8ae5ac94c332f9bf3a6049600 Feb 17 15:25:17 crc kubenswrapper[4717]: I0217 15:25:17.502006 4717 generic.go:334] "Generic (PLEG): container finished" podID="7a89abed-472d-412d-9b9a-60fd5ada6584" containerID="edbc7cde21ee34f55c02efea44b109fc94c63dea09161937867793332281c10d" exitCode=0 Feb 17 15:25:17 crc kubenswrapper[4717]: I0217 15:25:17.502094 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnpsk" event={"ID":"7a89abed-472d-412d-9b9a-60fd5ada6584","Type":"ContainerDied","Data":"edbc7cde21ee34f55c02efea44b109fc94c63dea09161937867793332281c10d"} Feb 17 15:25:17 crc kubenswrapper[4717]: I0217 15:25:17.502304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnpsk" event={"ID":"7a89abed-472d-412d-9b9a-60fd5ada6584","Type":"ContainerStarted","Data":"b68fc98f0c7497afd6e2b5fe39fd32b9b98be5d8ae5ac94c332f9bf3a6049600"} Feb 17 15:25:18 crc kubenswrapper[4717]: I0217 15:25:18.515769 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnpsk" 
event={"ID":"7a89abed-472d-412d-9b9a-60fd5ada6584","Type":"ContainerStarted","Data":"372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2"} Feb 17 15:25:19 crc kubenswrapper[4717]: I0217 15:25:19.528005 4717 generic.go:334] "Generic (PLEG): container finished" podID="7a89abed-472d-412d-9b9a-60fd5ada6584" containerID="372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2" exitCode=0 Feb 17 15:25:19 crc kubenswrapper[4717]: I0217 15:25:19.528165 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnpsk" event={"ID":"7a89abed-472d-412d-9b9a-60fd5ada6584","Type":"ContainerDied","Data":"372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2"} Feb 17 15:25:20 crc kubenswrapper[4717]: I0217 15:25:20.543650 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnpsk" event={"ID":"7a89abed-472d-412d-9b9a-60fd5ada6584","Type":"ContainerStarted","Data":"48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a"} Feb 17 15:25:20 crc kubenswrapper[4717]: I0217 15:25:20.574151 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cnpsk" podStartSLOduration=3.128175943 podStartE2EDuration="5.574129094s" podCreationTimestamp="2026-02-17 15:25:15 +0000 UTC" firstStartedPulling="2026-02-17 15:25:17.503583479 +0000 UTC m=+1983.919423955" lastFinishedPulling="2026-02-17 15:25:19.94953659 +0000 UTC m=+1986.365377106" observedRunningTime="2026-02-17 15:25:20.564878812 +0000 UTC m=+1986.980719298" watchObservedRunningTime="2026-02-17 15:25:20.574129094 +0000 UTC m=+1986.989969600" Feb 17 15:25:26 crc kubenswrapper[4717]: I0217 15:25:26.104253 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:26 crc kubenswrapper[4717]: I0217 15:25:26.105207 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:26 crc kubenswrapper[4717]: I0217 15:25:26.190791 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:26 crc kubenswrapper[4717]: I0217 15:25:26.663891 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:27 crc kubenswrapper[4717]: I0217 15:25:27.555702 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cnpsk"] Feb 17 15:25:28 crc kubenswrapper[4717]: I0217 15:25:28.626504 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cnpsk" podUID="7a89abed-472d-412d-9b9a-60fd5ada6584" containerName="registry-server" containerID="cri-o://48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a" gracePeriod=2 Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.159823 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.260374 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-utilities\") pod \"7a89abed-472d-412d-9b9a-60fd5ada6584\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.260641 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-catalog-content\") pod \"7a89abed-472d-412d-9b9a-60fd5ada6584\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.260681 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dgrh\" (UniqueName: \"kubernetes.io/projected/7a89abed-472d-412d-9b9a-60fd5ada6584-kube-api-access-2dgrh\") pod \"7a89abed-472d-412d-9b9a-60fd5ada6584\" (UID: \"7a89abed-472d-412d-9b9a-60fd5ada6584\") " Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.261857 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-utilities" (OuterVolumeSpecName: "utilities") pod "7a89abed-472d-412d-9b9a-60fd5ada6584" (UID: "7a89abed-472d-412d-9b9a-60fd5ada6584"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.268599 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a89abed-472d-412d-9b9a-60fd5ada6584-kube-api-access-2dgrh" (OuterVolumeSpecName: "kube-api-access-2dgrh") pod "7a89abed-472d-412d-9b9a-60fd5ada6584" (UID: "7a89abed-472d-412d-9b9a-60fd5ada6584"). InnerVolumeSpecName "kube-api-access-2dgrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.321503 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a89abed-472d-412d-9b9a-60fd5ada6584" (UID: "7a89abed-472d-412d-9b9a-60fd5ada6584"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.363578 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.363620 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dgrh\" (UniqueName: \"kubernetes.io/projected/7a89abed-472d-412d-9b9a-60fd5ada6584-kube-api-access-2dgrh\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.363632 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a89abed-472d-412d-9b9a-60fd5ada6584-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.640022 4717 generic.go:334] "Generic (PLEG): container finished" podID="7a89abed-472d-412d-9b9a-60fd5ada6584" containerID="48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a" exitCode=0 Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.640130 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnpsk" event={"ID":"7a89abed-472d-412d-9b9a-60fd5ada6584","Type":"ContainerDied","Data":"48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a"} Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.640195 4717 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-cnpsk" event={"ID":"7a89abed-472d-412d-9b9a-60fd5ada6584","Type":"ContainerDied","Data":"b68fc98f0c7497afd6e2b5fe39fd32b9b98be5d8ae5ac94c332f9bf3a6049600"} Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.640231 4717 scope.go:117] "RemoveContainer" containerID="48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.640235 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cnpsk" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.672020 4717 scope.go:117] "RemoveContainer" containerID="372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.690444 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cnpsk"] Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.700260 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cnpsk"] Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.711540 4717 scope.go:117] "RemoveContainer" containerID="edbc7cde21ee34f55c02efea44b109fc94c63dea09161937867793332281c10d" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.762561 4717 scope.go:117] "RemoveContainer" containerID="48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a" Feb 17 15:25:29 crc kubenswrapper[4717]: E0217 15:25:29.762925 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a\": container with ID starting with 48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a not found: ID does not exist" containerID="48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 
15:25:29.762956 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a"} err="failed to get container status \"48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a\": rpc error: code = NotFound desc = could not find container \"48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a\": container with ID starting with 48f269413090df614b67f34db6f968723b8a551089ebfb4292aa8d393c13e78a not found: ID does not exist" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.762975 4717 scope.go:117] "RemoveContainer" containerID="372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2" Feb 17 15:25:29 crc kubenswrapper[4717]: E0217 15:25:29.763202 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2\": container with ID starting with 372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2 not found: ID does not exist" containerID="372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.763235 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2"} err="failed to get container status \"372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2\": rpc error: code = NotFound desc = could not find container \"372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2\": container with ID starting with 372220de8e014e82f3294255d30835ad90369683ba356bd9fec045a0ac9757f2 not found: ID does not exist" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.763256 4717 scope.go:117] "RemoveContainer" containerID="edbc7cde21ee34f55c02efea44b109fc94c63dea09161937867793332281c10d" Feb 17 15:25:29 crc 
kubenswrapper[4717]: E0217 15:25:29.763458 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edbc7cde21ee34f55c02efea44b109fc94c63dea09161937867793332281c10d\": container with ID starting with edbc7cde21ee34f55c02efea44b109fc94c63dea09161937867793332281c10d not found: ID does not exist" containerID="edbc7cde21ee34f55c02efea44b109fc94c63dea09161937867793332281c10d" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.763489 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbc7cde21ee34f55c02efea44b109fc94c63dea09161937867793332281c10d"} err="failed to get container status \"edbc7cde21ee34f55c02efea44b109fc94c63dea09161937867793332281c10d\": rpc error: code = NotFound desc = could not find container \"edbc7cde21ee34f55c02efea44b109fc94c63dea09161937867793332281c10d\": container with ID starting with edbc7cde21ee34f55c02efea44b109fc94c63dea09161937867793332281c10d not found: ID does not exist" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.847226 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:25:29 crc kubenswrapper[4717]: I0217 15:25:29.858725 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a89abed-472d-412d-9b9a-60fd5ada6584" path="/var/lib/kubelet/pods/7a89abed-472d-412d-9b9a-60fd5ada6584/volumes" Feb 17 15:25:30 crc kubenswrapper[4717]: I0217 15:25:30.659700 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"f2ddaca91157a7f3cebc0d5a4db9348eb8786a6e148854bda086735851b9900e"} Feb 17 15:26:02 crc kubenswrapper[4717]: I0217 15:26:02.994710 4717 generic.go:334] "Generic (PLEG): container finished" podID="f366c26c-4b32-488e-8738-dbbf0ddd3adc" 
containerID="20d987a8291a2b34004cc6c28cca5c9a3f4735197a5b28b3ee36c69d50fd19d8" exitCode=0 Feb 17 15:26:02 crc kubenswrapper[4717]: I0217 15:26:02.995567 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" event={"ID":"f366c26c-4b32-488e-8738-dbbf0ddd3adc","Type":"ContainerDied","Data":"20d987a8291a2b34004cc6c28cca5c9a3f4735197a5b28b3ee36c69d50fd19d8"} Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.507589 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.634886 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5cxl\" (UniqueName: \"kubernetes.io/projected/f366c26c-4b32-488e-8738-dbbf0ddd3adc-kube-api-access-k5cxl\") pod \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.635415 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovncontroller-config-0\") pod \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.635513 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovn-combined-ca-bundle\") pod \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.635673 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ssh-key-openstack-edpm-ipam\") pod \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.635792 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-inventory\") pod \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\" (UID: \"f366c26c-4b32-488e-8738-dbbf0ddd3adc\") " Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.643118 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f366c26c-4b32-488e-8738-dbbf0ddd3adc" (UID: "f366c26c-4b32-488e-8738-dbbf0ddd3adc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.647364 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f366c26c-4b32-488e-8738-dbbf0ddd3adc-kube-api-access-k5cxl" (OuterVolumeSpecName: "kube-api-access-k5cxl") pod "f366c26c-4b32-488e-8738-dbbf0ddd3adc" (UID: "f366c26c-4b32-488e-8738-dbbf0ddd3adc"). InnerVolumeSpecName "kube-api-access-k5cxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.673905 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f366c26c-4b32-488e-8738-dbbf0ddd3adc" (UID: "f366c26c-4b32-488e-8738-dbbf0ddd3adc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.674426 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-inventory" (OuterVolumeSpecName: "inventory") pod "f366c26c-4b32-488e-8738-dbbf0ddd3adc" (UID: "f366c26c-4b32-488e-8738-dbbf0ddd3adc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.689051 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "f366c26c-4b32-488e-8738-dbbf0ddd3adc" (UID: "f366c26c-4b32-488e-8738-dbbf0ddd3adc"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.738332 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.738378 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.738389 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f366c26c-4b32-488e-8738-dbbf0ddd3adc-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.738399 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5cxl\" (UniqueName: 
\"kubernetes.io/projected/f366c26c-4b32-488e-8738-dbbf0ddd3adc-kube-api-access-k5cxl\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:04 crc kubenswrapper[4717]: I0217 15:26:04.738409 4717 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f366c26c-4b32-488e-8738-dbbf0ddd3adc-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.018113 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" event={"ID":"f366c26c-4b32-488e-8738-dbbf0ddd3adc","Type":"ContainerDied","Data":"e2d636bf7b18367fa92186790a62bb859982a23c1ea51164c977b7fbee3ae9a8"} Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.018192 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2d636bf7b18367fa92186790a62bb859982a23c1ea51164c977b7fbee3ae9a8" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.018488 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mvpm4" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.298507 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb"] Feb 17 15:26:05 crc kubenswrapper[4717]: E0217 15:26:05.298969 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a89abed-472d-412d-9b9a-60fd5ada6584" containerName="extract-content" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.298991 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a89abed-472d-412d-9b9a-60fd5ada6584" containerName="extract-content" Feb 17 15:26:05 crc kubenswrapper[4717]: E0217 15:26:05.299006 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a89abed-472d-412d-9b9a-60fd5ada6584" containerName="extract-utilities" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.299015 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a89abed-472d-412d-9b9a-60fd5ada6584" containerName="extract-utilities" Feb 17 15:26:05 crc kubenswrapper[4717]: E0217 15:26:05.299033 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f366c26c-4b32-488e-8738-dbbf0ddd3adc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.299041 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f366c26c-4b32-488e-8738-dbbf0ddd3adc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 15:26:05 crc kubenswrapper[4717]: E0217 15:26:05.299057 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a89abed-472d-412d-9b9a-60fd5ada6584" containerName="registry-server" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.299064 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a89abed-472d-412d-9b9a-60fd5ada6584" containerName="registry-server" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.299314 4717 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7a89abed-472d-412d-9b9a-60fd5ada6584" containerName="registry-server" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.299345 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f366c26c-4b32-488e-8738-dbbf0ddd3adc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.300168 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.303463 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.303994 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.304158 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.304309 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.304494 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.304689 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.313453 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb"] Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.352268 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.352699 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmb2f\" (UniqueName: \"kubernetes.io/projected/e4ab62d3-2bec-418d-ad69-7d384f86652c-kube-api-access-mmb2f\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.352815 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.352871 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.352907 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.352937 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.454588 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.454656 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmb2f\" (UniqueName: \"kubernetes.io/projected/e4ab62d3-2bec-418d-ad69-7d384f86652c-kube-api-access-mmb2f\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.454773 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.454833 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.454872 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.454902 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.462617 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: 
\"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.463503 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.464070 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.466947 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.467129 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.481807 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmb2f\" (UniqueName: \"kubernetes.io/projected/e4ab62d3-2bec-418d-ad69-7d384f86652c-kube-api-access-mmb2f\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:05 crc kubenswrapper[4717]: I0217 15:26:05.626517 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:06 crc kubenswrapper[4717]: I0217 15:26:06.098219 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb"] Feb 17 15:26:07 crc kubenswrapper[4717]: I0217 15:26:07.040561 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" event={"ID":"e4ab62d3-2bec-418d-ad69-7d384f86652c","Type":"ContainerStarted","Data":"002aa1faee3b2bb4817ce2179eded39b5841c6ac6c684fb0066c7041298cd648"} Feb 17 15:26:07 crc kubenswrapper[4717]: I0217 15:26:07.042747 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" event={"ID":"e4ab62d3-2bec-418d-ad69-7d384f86652c","Type":"ContainerStarted","Data":"74b167f415599d228b2962d138bf759a27f4be6246c7410469740032dcc6c5dd"} Feb 17 15:26:07 crc kubenswrapper[4717]: I0217 15:26:07.075343 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" podStartSLOduration=1.610139912 podStartE2EDuration="2.075327265s" podCreationTimestamp="2026-02-17 15:26:05 +0000 UTC" firstStartedPulling="2026-02-17 
15:26:06.10907485 +0000 UTC m=+2032.524915326" lastFinishedPulling="2026-02-17 15:26:06.574262163 +0000 UTC m=+2032.990102679" observedRunningTime="2026-02-17 15:26:07.070054595 +0000 UTC m=+2033.485895081" watchObservedRunningTime="2026-02-17 15:26:07.075327265 +0000 UTC m=+2033.491167741" Feb 17 15:26:56 crc kubenswrapper[4717]: I0217 15:26:56.533883 4717 generic.go:334] "Generic (PLEG): container finished" podID="e4ab62d3-2bec-418d-ad69-7d384f86652c" containerID="002aa1faee3b2bb4817ce2179eded39b5841c6ac6c684fb0066c7041298cd648" exitCode=0 Feb 17 15:26:56 crc kubenswrapper[4717]: I0217 15:26:56.534003 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" event={"ID":"e4ab62d3-2bec-418d-ad69-7d384f86652c","Type":"ContainerDied","Data":"002aa1faee3b2bb4817ce2179eded39b5841c6ac6c684fb0066c7041298cd648"} Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.020650 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.125242 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-inventory\") pod \"e4ab62d3-2bec-418d-ad69-7d384f86652c\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.125313 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-nova-metadata-neutron-config-0\") pod \"e4ab62d3-2bec-418d-ad69-7d384f86652c\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.125353 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-ssh-key-openstack-edpm-ipam\") pod \"e4ab62d3-2bec-418d-ad69-7d384f86652c\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.125476 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e4ab62d3-2bec-418d-ad69-7d384f86652c\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.125523 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-metadata-combined-ca-bundle\") pod \"e4ab62d3-2bec-418d-ad69-7d384f86652c\" (UID: 
\"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.125689 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmb2f\" (UniqueName: \"kubernetes.io/projected/e4ab62d3-2bec-418d-ad69-7d384f86652c-kube-api-access-mmb2f\") pod \"e4ab62d3-2bec-418d-ad69-7d384f86652c\" (UID: \"e4ab62d3-2bec-418d-ad69-7d384f86652c\") " Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.132197 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ab62d3-2bec-418d-ad69-7d384f86652c-kube-api-access-mmb2f" (OuterVolumeSpecName: "kube-api-access-mmb2f") pod "e4ab62d3-2bec-418d-ad69-7d384f86652c" (UID: "e4ab62d3-2bec-418d-ad69-7d384f86652c"). InnerVolumeSpecName "kube-api-access-mmb2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.132667 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e4ab62d3-2bec-418d-ad69-7d384f86652c" (UID: "e4ab62d3-2bec-418d-ad69-7d384f86652c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.153209 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e4ab62d3-2bec-418d-ad69-7d384f86652c" (UID: "e4ab62d3-2bec-418d-ad69-7d384f86652c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.161932 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e4ab62d3-2bec-418d-ad69-7d384f86652c" (UID: "e4ab62d3-2bec-418d-ad69-7d384f86652c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.167446 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-inventory" (OuterVolumeSpecName: "inventory") pod "e4ab62d3-2bec-418d-ad69-7d384f86652c" (UID: "e4ab62d3-2bec-418d-ad69-7d384f86652c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.168292 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e4ab62d3-2bec-418d-ad69-7d384f86652c" (UID: "e4ab62d3-2bec-418d-ad69-7d384f86652c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.228292 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmb2f\" (UniqueName: \"kubernetes.io/projected/e4ab62d3-2bec-418d-ad69-7d384f86652c-kube-api-access-mmb2f\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.228355 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.228370 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.228384 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.228409 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.228431 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ab62d3-2bec-418d-ad69-7d384f86652c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.555643 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" event={"ID":"e4ab62d3-2bec-418d-ad69-7d384f86652c","Type":"ContainerDied","Data":"74b167f415599d228b2962d138bf759a27f4be6246c7410469740032dcc6c5dd"} Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.556584 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74b167f415599d228b2962d138bf759a27f4be6246c7410469740032dcc6c5dd" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.556347 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.663071 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t"] Feb 17 15:26:58 crc kubenswrapper[4717]: E0217 15:26:58.663724 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ab62d3-2bec-418d-ad69-7d384f86652c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.663758 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ab62d3-2bec-418d-ad69-7d384f86652c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.664319 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ab62d3-2bec-418d-ad69-7d384f86652c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.665345 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.668146 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.668285 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.668186 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.668408 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.668218 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.676223 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t"] Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.737273 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.737328 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: 
\"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.737480 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9f77\" (UniqueName: \"kubernetes.io/projected/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-kube-api-access-z9f77\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.737719 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.737758 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.839745 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.839812 
4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.839943 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.839999 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.840053 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9f77\" (UniqueName: \"kubernetes.io/projected/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-kube-api-access-z9f77\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.845849 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: 
\"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.846334 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.857627 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.857691 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.869604 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9f77\" (UniqueName: \"kubernetes.io/projected/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-kube-api-access-z9f77\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ts54t\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:58 crc kubenswrapper[4717]: I0217 15:26:58.992059 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:26:59 crc kubenswrapper[4717]: I0217 15:26:59.635391 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t"] Feb 17 15:26:59 crc kubenswrapper[4717]: W0217 15:26:59.644798 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfff26fba_baaf_4ed4_9c5b_b6dec300d19c.slice/crio-48081227ed954d50e94726b5578b4bb0f9b25a3ef103b1df7389d1c63140b09a WatchSource:0}: Error finding container 48081227ed954d50e94726b5578b4bb0f9b25a3ef103b1df7389d1c63140b09a: Status 404 returned error can't find the container with id 48081227ed954d50e94726b5578b4bb0f9b25a3ef103b1df7389d1c63140b09a Feb 17 15:27:00 crc kubenswrapper[4717]: I0217 15:27:00.584561 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" event={"ID":"fff26fba-baaf-4ed4-9c5b-b6dec300d19c","Type":"ContainerStarted","Data":"4d191b45697381bda5cfe783ddef5b77512237b20d9959a6b27316fade303e71"} Feb 17 15:27:00 crc kubenswrapper[4717]: I0217 15:27:00.585517 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" event={"ID":"fff26fba-baaf-4ed4-9c5b-b6dec300d19c","Type":"ContainerStarted","Data":"48081227ed954d50e94726b5578b4bb0f9b25a3ef103b1df7389d1c63140b09a"} Feb 17 15:27:00 crc kubenswrapper[4717]: I0217 15:27:00.608123 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" podStartSLOduration=2.174029742 podStartE2EDuration="2.608104023s" podCreationTimestamp="2026-02-17 15:26:58 +0000 UTC" firstStartedPulling="2026-02-17 15:26:59.647591621 +0000 UTC m=+2086.063432107" lastFinishedPulling="2026-02-17 15:27:00.081665902 +0000 UTC m=+2086.497506388" 
observedRunningTime="2026-02-17 15:27:00.600272411 +0000 UTC m=+2087.016112967" watchObservedRunningTime="2026-02-17 15:27:00.608104023 +0000 UTC m=+2087.023944499" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.096675 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z72m5"] Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.101223 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.112884 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z72m5"] Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.241167 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxthb\" (UniqueName: \"kubernetes.io/projected/53a735d1-9c78-4551-9930-e23241c9e998-kube-api-access-zxthb\") pod \"redhat-operators-z72m5\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.241488 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-utilities\") pod \"redhat-operators-z72m5\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.241890 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-catalog-content\") pod \"redhat-operators-z72m5\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.343811 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-catalog-content\") pod \"redhat-operators-z72m5\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.343883 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxthb\" (UniqueName: \"kubernetes.io/projected/53a735d1-9c78-4551-9930-e23241c9e998-kube-api-access-zxthb\") pod \"redhat-operators-z72m5\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.343936 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-utilities\") pod \"redhat-operators-z72m5\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.344362 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-utilities\") pod \"redhat-operators-z72m5\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.344645 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-catalog-content\") pod \"redhat-operators-z72m5\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.365163 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zxthb\" (UniqueName: \"kubernetes.io/projected/53a735d1-9c78-4551-9930-e23241c9e998-kube-api-access-zxthb\") pod \"redhat-operators-z72m5\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.447726 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.884072 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z72m5"] Feb 17 15:27:31 crc kubenswrapper[4717]: I0217 15:27:31.950943 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z72m5" event={"ID":"53a735d1-9c78-4551-9930-e23241c9e998","Type":"ContainerStarted","Data":"e5dd73560db17cc65ad25c872d3a61cc3741d9ab341770aafbe04d22041aea99"} Feb 17 15:27:32 crc kubenswrapper[4717]: I0217 15:27:32.964207 4717 generic.go:334] "Generic (PLEG): container finished" podID="53a735d1-9c78-4551-9930-e23241c9e998" containerID="40574ac34fefc43dc312f2a7fe8809d62a29639ae7295abea8a3ffb7c35ca79d" exitCode=0 Feb 17 15:27:32 crc kubenswrapper[4717]: I0217 15:27:32.964252 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z72m5" event={"ID":"53a735d1-9c78-4551-9930-e23241c9e998","Type":"ContainerDied","Data":"40574ac34fefc43dc312f2a7fe8809d62a29639ae7295abea8a3ffb7c35ca79d"} Feb 17 15:27:34 crc kubenswrapper[4717]: I0217 15:27:34.987520 4717 generic.go:334] "Generic (PLEG): container finished" podID="53a735d1-9c78-4551-9930-e23241c9e998" containerID="127832309cee262ce25598b9a1c7d41ca1b98127e52035ddd9b0ab8075b782dc" exitCode=0 Feb 17 15:27:34 crc kubenswrapper[4717]: I0217 15:27:34.987603 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z72m5" 
event={"ID":"53a735d1-9c78-4551-9930-e23241c9e998","Type":"ContainerDied","Data":"127832309cee262ce25598b9a1c7d41ca1b98127e52035ddd9b0ab8075b782dc"} Feb 17 15:27:36 crc kubenswrapper[4717]: I0217 15:27:36.004865 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z72m5" event={"ID":"53a735d1-9c78-4551-9930-e23241c9e998","Type":"ContainerStarted","Data":"5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4"} Feb 17 15:27:36 crc kubenswrapper[4717]: I0217 15:27:36.053758 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z72m5" podStartSLOduration=2.606844943 podStartE2EDuration="5.05373359s" podCreationTimestamp="2026-02-17 15:27:31 +0000 UTC" firstStartedPulling="2026-02-17 15:27:32.96714501 +0000 UTC m=+2119.382985486" lastFinishedPulling="2026-02-17 15:27:35.414033617 +0000 UTC m=+2121.829874133" observedRunningTime="2026-02-17 15:27:36.038115327 +0000 UTC m=+2122.453955843" watchObservedRunningTime="2026-02-17 15:27:36.05373359 +0000 UTC m=+2122.469574096" Feb 17 15:27:41 crc kubenswrapper[4717]: I0217 15:27:41.448506 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:41 crc kubenswrapper[4717]: I0217 15:27:41.449126 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:42 crc kubenswrapper[4717]: I0217 15:27:42.501461 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z72m5" podUID="53a735d1-9c78-4551-9930-e23241c9e998" containerName="registry-server" probeResult="failure" output=< Feb 17 15:27:42 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 17 15:27:42 crc kubenswrapper[4717]: > Feb 17 15:27:50 crc kubenswrapper[4717]: I0217 15:27:50.808218 4717 patch_prober.go:28] interesting 
pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:27:50 crc kubenswrapper[4717]: I0217 15:27:50.808823 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:27:51 crc kubenswrapper[4717]: I0217 15:27:51.528634 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:51 crc kubenswrapper[4717]: I0217 15:27:51.586059 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:52 crc kubenswrapper[4717]: I0217 15:27:52.564875 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z72m5"] Feb 17 15:27:52 crc kubenswrapper[4717]: I0217 15:27:52.621029 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z72m5" podUID="53a735d1-9c78-4551-9930-e23241c9e998" containerName="registry-server" containerID="cri-o://5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4" gracePeriod=2 Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.124417 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.275943 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxthb\" (UniqueName: \"kubernetes.io/projected/53a735d1-9c78-4551-9930-e23241c9e998-kube-api-access-zxthb\") pod \"53a735d1-9c78-4551-9930-e23241c9e998\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.276115 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-utilities\") pod \"53a735d1-9c78-4551-9930-e23241c9e998\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.276198 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-catalog-content\") pod \"53a735d1-9c78-4551-9930-e23241c9e998\" (UID: \"53a735d1-9c78-4551-9930-e23241c9e998\") " Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.277463 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-utilities" (OuterVolumeSpecName: "utilities") pod "53a735d1-9c78-4551-9930-e23241c9e998" (UID: "53a735d1-9c78-4551-9930-e23241c9e998"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.284529 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a735d1-9c78-4551-9930-e23241c9e998-kube-api-access-zxthb" (OuterVolumeSpecName: "kube-api-access-zxthb") pod "53a735d1-9c78-4551-9930-e23241c9e998" (UID: "53a735d1-9c78-4551-9930-e23241c9e998"). InnerVolumeSpecName "kube-api-access-zxthb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.378533 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxthb\" (UniqueName: \"kubernetes.io/projected/53a735d1-9c78-4551-9930-e23241c9e998-kube-api-access-zxthb\") on node \"crc\" DevicePath \"\"" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.378564 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.416795 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53a735d1-9c78-4551-9930-e23241c9e998" (UID: "53a735d1-9c78-4551-9930-e23241c9e998"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.479912 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53a735d1-9c78-4551-9930-e23241c9e998-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.634349 4717 generic.go:334] "Generic (PLEG): container finished" podID="53a735d1-9c78-4551-9930-e23241c9e998" containerID="5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4" exitCode=0 Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.634419 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z72m5" event={"ID":"53a735d1-9c78-4551-9930-e23241c9e998","Type":"ContainerDied","Data":"5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4"} Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.634469 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z72m5" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.634494 4717 scope.go:117] "RemoveContainer" containerID="5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.634476 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z72m5" event={"ID":"53a735d1-9c78-4551-9930-e23241c9e998","Type":"ContainerDied","Data":"e5dd73560db17cc65ad25c872d3a61cc3741d9ab341770aafbe04d22041aea99"} Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.665067 4717 scope.go:117] "RemoveContainer" containerID="127832309cee262ce25598b9a1c7d41ca1b98127e52035ddd9b0ab8075b782dc" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.698962 4717 scope.go:117] "RemoveContainer" containerID="40574ac34fefc43dc312f2a7fe8809d62a29639ae7295abea8a3ffb7c35ca79d" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.704595 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z72m5"] Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.716073 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z72m5"] Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.777519 4717 scope.go:117] "RemoveContainer" containerID="5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4" Feb 17 15:27:53 crc kubenswrapper[4717]: E0217 15:27:53.777954 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4\": container with ID starting with 5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4 not found: ID does not exist" containerID="5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.777983 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4"} err="failed to get container status \"5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4\": rpc error: code = NotFound desc = could not find container \"5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4\": container with ID starting with 5b91a02e4298e384763488ed00bea9a366778d15b0e19f712b70f486f81ec1c4 not found: ID does not exist" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.778004 4717 scope.go:117] "RemoveContainer" containerID="127832309cee262ce25598b9a1c7d41ca1b98127e52035ddd9b0ab8075b782dc" Feb 17 15:27:53 crc kubenswrapper[4717]: E0217 15:27:53.778319 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"127832309cee262ce25598b9a1c7d41ca1b98127e52035ddd9b0ab8075b782dc\": container with ID starting with 127832309cee262ce25598b9a1c7d41ca1b98127e52035ddd9b0ab8075b782dc not found: ID does not exist" containerID="127832309cee262ce25598b9a1c7d41ca1b98127e52035ddd9b0ab8075b782dc" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.778366 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"127832309cee262ce25598b9a1c7d41ca1b98127e52035ddd9b0ab8075b782dc"} err="failed to get container status \"127832309cee262ce25598b9a1c7d41ca1b98127e52035ddd9b0ab8075b782dc\": rpc error: code = NotFound desc = could not find container \"127832309cee262ce25598b9a1c7d41ca1b98127e52035ddd9b0ab8075b782dc\": container with ID starting with 127832309cee262ce25598b9a1c7d41ca1b98127e52035ddd9b0ab8075b782dc not found: ID does not exist" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.778397 4717 scope.go:117] "RemoveContainer" containerID="40574ac34fefc43dc312f2a7fe8809d62a29639ae7295abea8a3ffb7c35ca79d" Feb 17 15:27:53 crc kubenswrapper[4717]: E0217 
15:27:53.778683 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40574ac34fefc43dc312f2a7fe8809d62a29639ae7295abea8a3ffb7c35ca79d\": container with ID starting with 40574ac34fefc43dc312f2a7fe8809d62a29639ae7295abea8a3ffb7c35ca79d not found: ID does not exist" containerID="40574ac34fefc43dc312f2a7fe8809d62a29639ae7295abea8a3ffb7c35ca79d" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.778711 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40574ac34fefc43dc312f2a7fe8809d62a29639ae7295abea8a3ffb7c35ca79d"} err="failed to get container status \"40574ac34fefc43dc312f2a7fe8809d62a29639ae7295abea8a3ffb7c35ca79d\": rpc error: code = NotFound desc = could not find container \"40574ac34fefc43dc312f2a7fe8809d62a29639ae7295abea8a3ffb7c35ca79d\": container with ID starting with 40574ac34fefc43dc312f2a7fe8809d62a29639ae7295abea8a3ffb7c35ca79d not found: ID does not exist" Feb 17 15:27:53 crc kubenswrapper[4717]: I0217 15:27:53.859328 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53a735d1-9c78-4551-9930-e23241c9e998" path="/var/lib/kubelet/pods/53a735d1-9c78-4551-9930-e23241c9e998/volumes" Feb 17 15:28:20 crc kubenswrapper[4717]: I0217 15:28:20.807847 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:28:20 crc kubenswrapper[4717]: I0217 15:28:20.808411 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.204252 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sjwnb"] Feb 17 15:28:39 crc kubenswrapper[4717]: E0217 15:28:39.205538 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a735d1-9c78-4551-9930-e23241c9e998" containerName="registry-server" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.205564 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a735d1-9c78-4551-9930-e23241c9e998" containerName="registry-server" Feb 17 15:28:39 crc kubenswrapper[4717]: E0217 15:28:39.205609 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a735d1-9c78-4551-9930-e23241c9e998" containerName="extract-utilities" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.205624 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a735d1-9c78-4551-9930-e23241c9e998" containerName="extract-utilities" Feb 17 15:28:39 crc kubenswrapper[4717]: E0217 15:28:39.205657 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a735d1-9c78-4551-9930-e23241c9e998" containerName="extract-content" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.205670 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a735d1-9c78-4551-9930-e23241c9e998" containerName="extract-content" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.206143 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a735d1-9c78-4551-9930-e23241c9e998" containerName="registry-server" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.208473 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.259860 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjwnb"] Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.347445 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4txm4\" (UniqueName: \"kubernetes.io/projected/b68547fb-3fea-4f95-804f-74d53e50bc33-kube-api-access-4txm4\") pod \"redhat-marketplace-sjwnb\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.347732 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-catalog-content\") pod \"redhat-marketplace-sjwnb\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.347824 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-utilities\") pod \"redhat-marketplace-sjwnb\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.450357 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4txm4\" (UniqueName: \"kubernetes.io/projected/b68547fb-3fea-4f95-804f-74d53e50bc33-kube-api-access-4txm4\") pod \"redhat-marketplace-sjwnb\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.450423 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-catalog-content\") pod \"redhat-marketplace-sjwnb\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.450454 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-utilities\") pod \"redhat-marketplace-sjwnb\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.451042 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-utilities\") pod \"redhat-marketplace-sjwnb\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.451292 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-catalog-content\") pod \"redhat-marketplace-sjwnb\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.473581 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4txm4\" (UniqueName: \"kubernetes.io/projected/b68547fb-3fea-4f95-804f-74d53e50bc33-kube-api-access-4txm4\") pod \"redhat-marketplace-sjwnb\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:39 crc kubenswrapper[4717]: I0217 15:28:39.544953 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:40 crc kubenswrapper[4717]: I0217 15:28:40.040006 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjwnb"] Feb 17 15:28:40 crc kubenswrapper[4717]: I0217 15:28:40.118819 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjwnb" event={"ID":"b68547fb-3fea-4f95-804f-74d53e50bc33","Type":"ContainerStarted","Data":"468bf18d11ac02233c64f61c93570642d3c7ef96a3e62dc158e59102b7a81bac"} Feb 17 15:28:41 crc kubenswrapper[4717]: I0217 15:28:41.130893 4717 generic.go:334] "Generic (PLEG): container finished" podID="b68547fb-3fea-4f95-804f-74d53e50bc33" containerID="58d3265760c79b86d71317aad10cbc6f80f1aa4baccba312e590330d7fcaf696" exitCode=0 Feb 17 15:28:41 crc kubenswrapper[4717]: I0217 15:28:41.131069 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjwnb" event={"ID":"b68547fb-3fea-4f95-804f-74d53e50bc33","Type":"ContainerDied","Data":"58d3265760c79b86d71317aad10cbc6f80f1aa4baccba312e590330d7fcaf696"} Feb 17 15:28:42 crc kubenswrapper[4717]: I0217 15:28:42.143480 4717 generic.go:334] "Generic (PLEG): container finished" podID="b68547fb-3fea-4f95-804f-74d53e50bc33" containerID="b0537c09332e47319fab17f6739cb55d39304f076db9080f5527d293cd5f35c9" exitCode=0 Feb 17 15:28:42 crc kubenswrapper[4717]: I0217 15:28:42.143582 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjwnb" event={"ID":"b68547fb-3fea-4f95-804f-74d53e50bc33","Type":"ContainerDied","Data":"b0537c09332e47319fab17f6739cb55d39304f076db9080f5527d293cd5f35c9"} Feb 17 15:28:43 crc kubenswrapper[4717]: I0217 15:28:43.159050 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjwnb" 
event={"ID":"b68547fb-3fea-4f95-804f-74d53e50bc33","Type":"ContainerStarted","Data":"41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4"} Feb 17 15:28:43 crc kubenswrapper[4717]: I0217 15:28:43.192426 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sjwnb" podStartSLOduration=2.444861335 podStartE2EDuration="4.192401008s" podCreationTimestamp="2026-02-17 15:28:39 +0000 UTC" firstStartedPulling="2026-02-17 15:28:41.133519193 +0000 UTC m=+2187.549359679" lastFinishedPulling="2026-02-17 15:28:42.881058866 +0000 UTC m=+2189.296899352" observedRunningTime="2026-02-17 15:28:43.183172067 +0000 UTC m=+2189.599012553" watchObservedRunningTime="2026-02-17 15:28:43.192401008 +0000 UTC m=+2189.608241494" Feb 17 15:28:49 crc kubenswrapper[4717]: I0217 15:28:49.545704 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:49 crc kubenswrapper[4717]: I0217 15:28:49.547652 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:49 crc kubenswrapper[4717]: I0217 15:28:49.614211 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:50 crc kubenswrapper[4717]: I0217 15:28:50.494695 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:50 crc kubenswrapper[4717]: I0217 15:28:50.548400 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjwnb"] Feb 17 15:28:50 crc kubenswrapper[4717]: I0217 15:28:50.808380 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:28:50 crc kubenswrapper[4717]: I0217 15:28:50.808777 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:28:50 crc kubenswrapper[4717]: I0217 15:28:50.808850 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:28:50 crc kubenswrapper[4717]: I0217 15:28:50.810040 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2ddaca91157a7f3cebc0d5a4db9348eb8786a6e148854bda086735851b9900e"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:28:50 crc kubenswrapper[4717]: I0217 15:28:50.810219 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://f2ddaca91157a7f3cebc0d5a4db9348eb8786a6e148854bda086735851b9900e" gracePeriod=600 Feb 17 15:28:51 crc kubenswrapper[4717]: I0217 15:28:51.444050 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="f2ddaca91157a7f3cebc0d5a4db9348eb8786a6e148854bda086735851b9900e" exitCode=0 Feb 17 15:28:51 crc kubenswrapper[4717]: I0217 15:28:51.444115 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" 
event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"f2ddaca91157a7f3cebc0d5a4db9348eb8786a6e148854bda086735851b9900e"} Feb 17 15:28:51 crc kubenswrapper[4717]: I0217 15:28:51.444425 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071"} Feb 17 15:28:51 crc kubenswrapper[4717]: I0217 15:28:51.444445 4717 scope.go:117] "RemoveContainer" containerID="507da7bce1bfddf66d1adeb9b81699a690f109dbde7eb53ae6ebd553bafafeea" Feb 17 15:28:52 crc kubenswrapper[4717]: I0217 15:28:52.461815 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sjwnb" podUID="b68547fb-3fea-4f95-804f-74d53e50bc33" containerName="registry-server" containerID="cri-o://41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4" gracePeriod=2 Feb 17 15:28:52 crc kubenswrapper[4717]: I0217 15:28:52.928425 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.044727 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-catalog-content\") pod \"b68547fb-3fea-4f95-804f-74d53e50bc33\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.045056 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-utilities\") pod \"b68547fb-3fea-4f95-804f-74d53e50bc33\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.045166 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4txm4\" (UniqueName: \"kubernetes.io/projected/b68547fb-3fea-4f95-804f-74d53e50bc33-kube-api-access-4txm4\") pod \"b68547fb-3fea-4f95-804f-74d53e50bc33\" (UID: \"b68547fb-3fea-4f95-804f-74d53e50bc33\") " Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.045981 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-utilities" (OuterVolumeSpecName: "utilities") pod "b68547fb-3fea-4f95-804f-74d53e50bc33" (UID: "b68547fb-3fea-4f95-804f-74d53e50bc33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.052439 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b68547fb-3fea-4f95-804f-74d53e50bc33-kube-api-access-4txm4" (OuterVolumeSpecName: "kube-api-access-4txm4") pod "b68547fb-3fea-4f95-804f-74d53e50bc33" (UID: "b68547fb-3fea-4f95-804f-74d53e50bc33"). InnerVolumeSpecName "kube-api-access-4txm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.069768 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b68547fb-3fea-4f95-804f-74d53e50bc33" (UID: "b68547fb-3fea-4f95-804f-74d53e50bc33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.146937 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.147163 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b68547fb-3fea-4f95-804f-74d53e50bc33-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.147263 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4txm4\" (UniqueName: \"kubernetes.io/projected/b68547fb-3fea-4f95-804f-74d53e50bc33-kube-api-access-4txm4\") on node \"crc\" DevicePath \"\"" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.472455 4717 generic.go:334] "Generic (PLEG): container finished" podID="b68547fb-3fea-4f95-804f-74d53e50bc33" containerID="41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4" exitCode=0 Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.472526 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjwnb" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.472517 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjwnb" event={"ID":"b68547fb-3fea-4f95-804f-74d53e50bc33","Type":"ContainerDied","Data":"41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4"} Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.472680 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjwnb" event={"ID":"b68547fb-3fea-4f95-804f-74d53e50bc33","Type":"ContainerDied","Data":"468bf18d11ac02233c64f61c93570642d3c7ef96a3e62dc158e59102b7a81bac"} Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.472710 4717 scope.go:117] "RemoveContainer" containerID="41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.493030 4717 scope.go:117] "RemoveContainer" containerID="b0537c09332e47319fab17f6739cb55d39304f076db9080f5527d293cd5f35c9" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.511553 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjwnb"] Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.521806 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjwnb"] Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.528641 4717 scope.go:117] "RemoveContainer" containerID="58d3265760c79b86d71317aad10cbc6f80f1aa4baccba312e590330d7fcaf696" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.561437 4717 scope.go:117] "RemoveContainer" containerID="41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4" Feb 17 15:28:53 crc kubenswrapper[4717]: E0217 15:28:53.562074 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4\": container with ID starting with 41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4 not found: ID does not exist" containerID="41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.562146 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4"} err="failed to get container status \"41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4\": rpc error: code = NotFound desc = could not find container \"41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4\": container with ID starting with 41530b7d1b780a0c65da8463c289a3b8bb1496360ed6b9c832e725eda0109ed4 not found: ID does not exist" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.562197 4717 scope.go:117] "RemoveContainer" containerID="b0537c09332e47319fab17f6739cb55d39304f076db9080f5527d293cd5f35c9" Feb 17 15:28:53 crc kubenswrapper[4717]: E0217 15:28:53.562576 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0537c09332e47319fab17f6739cb55d39304f076db9080f5527d293cd5f35c9\": container with ID starting with b0537c09332e47319fab17f6739cb55d39304f076db9080f5527d293cd5f35c9 not found: ID does not exist" containerID="b0537c09332e47319fab17f6739cb55d39304f076db9080f5527d293cd5f35c9" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.562612 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0537c09332e47319fab17f6739cb55d39304f076db9080f5527d293cd5f35c9"} err="failed to get container status \"b0537c09332e47319fab17f6739cb55d39304f076db9080f5527d293cd5f35c9\": rpc error: code = NotFound desc = could not find container \"b0537c09332e47319fab17f6739cb55d39304f076db9080f5527d293cd5f35c9\": container with ID 
starting with b0537c09332e47319fab17f6739cb55d39304f076db9080f5527d293cd5f35c9 not found: ID does not exist" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.562631 4717 scope.go:117] "RemoveContainer" containerID="58d3265760c79b86d71317aad10cbc6f80f1aa4baccba312e590330d7fcaf696" Feb 17 15:28:53 crc kubenswrapper[4717]: E0217 15:28:53.563012 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d3265760c79b86d71317aad10cbc6f80f1aa4baccba312e590330d7fcaf696\": container with ID starting with 58d3265760c79b86d71317aad10cbc6f80f1aa4baccba312e590330d7fcaf696 not found: ID does not exist" containerID="58d3265760c79b86d71317aad10cbc6f80f1aa4baccba312e590330d7fcaf696" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.563032 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d3265760c79b86d71317aad10cbc6f80f1aa4baccba312e590330d7fcaf696"} err="failed to get container status \"58d3265760c79b86d71317aad10cbc6f80f1aa4baccba312e590330d7fcaf696\": rpc error: code = NotFound desc = could not find container \"58d3265760c79b86d71317aad10cbc6f80f1aa4baccba312e590330d7fcaf696\": container with ID starting with 58d3265760c79b86d71317aad10cbc6f80f1aa4baccba312e590330d7fcaf696 not found: ID does not exist" Feb 17 15:28:53 crc kubenswrapper[4717]: I0217 15:28:53.858243 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b68547fb-3fea-4f95-804f-74d53e50bc33" path="/var/lib/kubelet/pods/b68547fb-3fea-4f95-804f-74d53e50bc33/volumes" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.162314 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj"] Feb 17 15:30:00 crc kubenswrapper[4717]: E0217 15:30:00.163505 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68547fb-3fea-4f95-804f-74d53e50bc33" containerName="extract-utilities" Feb 
17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.163526 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68547fb-3fea-4f95-804f-74d53e50bc33" containerName="extract-utilities" Feb 17 15:30:00 crc kubenswrapper[4717]: E0217 15:30:00.163539 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68547fb-3fea-4f95-804f-74d53e50bc33" containerName="extract-content" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.163545 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68547fb-3fea-4f95-804f-74d53e50bc33" containerName="extract-content" Feb 17 15:30:00 crc kubenswrapper[4717]: E0217 15:30:00.163565 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68547fb-3fea-4f95-804f-74d53e50bc33" containerName="registry-server" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.163574 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68547fb-3fea-4f95-804f-74d53e50bc33" containerName="registry-server" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.163832 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b68547fb-3fea-4f95-804f-74d53e50bc33" containerName="registry-server" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.164591 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.172925 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.173254 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.231805 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj"] Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.314968 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/974a7e82-05f0-4451-8848-de1d6d51916b-config-volume\") pod \"collect-profiles-29522370-bs6nj\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.315116 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/974a7e82-05f0-4451-8848-de1d6d51916b-secret-volume\") pod \"collect-profiles-29522370-bs6nj\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.315339 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djj46\" (UniqueName: \"kubernetes.io/projected/974a7e82-05f0-4451-8848-de1d6d51916b-kube-api-access-djj46\") pod \"collect-profiles-29522370-bs6nj\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.416903 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djj46\" (UniqueName: \"kubernetes.io/projected/974a7e82-05f0-4451-8848-de1d6d51916b-kube-api-access-djj46\") pod \"collect-profiles-29522370-bs6nj\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.416987 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/974a7e82-05f0-4451-8848-de1d6d51916b-config-volume\") pod \"collect-profiles-29522370-bs6nj\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.417035 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/974a7e82-05f0-4451-8848-de1d6d51916b-secret-volume\") pod \"collect-profiles-29522370-bs6nj\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.418303 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/974a7e82-05f0-4451-8848-de1d6d51916b-config-volume\") pod \"collect-profiles-29522370-bs6nj\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.426058 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/974a7e82-05f0-4451-8848-de1d6d51916b-secret-volume\") pod \"collect-profiles-29522370-bs6nj\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.446605 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djj46\" (UniqueName: \"kubernetes.io/projected/974a7e82-05f0-4451-8848-de1d6d51916b-kube-api-access-djj46\") pod \"collect-profiles-29522370-bs6nj\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.524148 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:00 crc kubenswrapper[4717]: I0217 15:30:00.990961 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj"] Feb 17 15:30:00 crc kubenswrapper[4717]: W0217 15:30:00.999743 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod974a7e82_05f0_4451_8848_de1d6d51916b.slice/crio-89fb7c4f69ed31e24a1aa21344c6e73aa89d06142f306aee99234f6638514fd0 WatchSource:0}: Error finding container 89fb7c4f69ed31e24a1aa21344c6e73aa89d06142f306aee99234f6638514fd0: Status 404 returned error can't find the container with id 89fb7c4f69ed31e24a1aa21344c6e73aa89d06142f306aee99234f6638514fd0 Feb 17 15:30:01 crc kubenswrapper[4717]: I0217 15:30:01.193532 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" event={"ID":"974a7e82-05f0-4451-8848-de1d6d51916b","Type":"ContainerStarted","Data":"89fb7c4f69ed31e24a1aa21344c6e73aa89d06142f306aee99234f6638514fd0"} Feb 17 15:30:02 crc 
kubenswrapper[4717]: I0217 15:30:02.211948 4717 generic.go:334] "Generic (PLEG): container finished" podID="974a7e82-05f0-4451-8848-de1d6d51916b" containerID="69c140071c11d4087262d739740cfb6d968fa25140a2c664cc93d7f370054a0a" exitCode=0 Feb 17 15:30:02 crc kubenswrapper[4717]: I0217 15:30:02.212020 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" event={"ID":"974a7e82-05f0-4451-8848-de1d6d51916b","Type":"ContainerDied","Data":"69c140071c11d4087262d739740cfb6d968fa25140a2c664cc93d7f370054a0a"} Feb 17 15:30:03 crc kubenswrapper[4717]: I0217 15:30:03.623236 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:03 crc kubenswrapper[4717]: I0217 15:30:03.682140 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djj46\" (UniqueName: \"kubernetes.io/projected/974a7e82-05f0-4451-8848-de1d6d51916b-kube-api-access-djj46\") pod \"974a7e82-05f0-4451-8848-de1d6d51916b\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " Feb 17 15:30:03 crc kubenswrapper[4717]: I0217 15:30:03.682619 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/974a7e82-05f0-4451-8848-de1d6d51916b-secret-volume\") pod \"974a7e82-05f0-4451-8848-de1d6d51916b\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " Feb 17 15:30:03 crc kubenswrapper[4717]: I0217 15:30:03.682798 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/974a7e82-05f0-4451-8848-de1d6d51916b-config-volume\") pod \"974a7e82-05f0-4451-8848-de1d6d51916b\" (UID: \"974a7e82-05f0-4451-8848-de1d6d51916b\") " Feb 17 15:30:03 crc kubenswrapper[4717]: I0217 15:30:03.683759 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/974a7e82-05f0-4451-8848-de1d6d51916b-config-volume" (OuterVolumeSpecName: "config-volume") pod "974a7e82-05f0-4451-8848-de1d6d51916b" (UID: "974a7e82-05f0-4451-8848-de1d6d51916b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:30:03 crc kubenswrapper[4717]: I0217 15:30:03.689914 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/974a7e82-05f0-4451-8848-de1d6d51916b-kube-api-access-djj46" (OuterVolumeSpecName: "kube-api-access-djj46") pod "974a7e82-05f0-4451-8848-de1d6d51916b" (UID: "974a7e82-05f0-4451-8848-de1d6d51916b"). InnerVolumeSpecName "kube-api-access-djj46". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:30:03 crc kubenswrapper[4717]: I0217 15:30:03.691744 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974a7e82-05f0-4451-8848-de1d6d51916b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "974a7e82-05f0-4451-8848-de1d6d51916b" (UID: "974a7e82-05f0-4451-8848-de1d6d51916b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:30:03 crc kubenswrapper[4717]: I0217 15:30:03.784227 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djj46\" (UniqueName: \"kubernetes.io/projected/974a7e82-05f0-4451-8848-de1d6d51916b-kube-api-access-djj46\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:03 crc kubenswrapper[4717]: I0217 15:30:03.784283 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/974a7e82-05f0-4451-8848-de1d6d51916b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:03 crc kubenswrapper[4717]: I0217 15:30:03.784302 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/974a7e82-05f0-4451-8848-de1d6d51916b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:04 crc kubenswrapper[4717]: I0217 15:30:04.240430 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" event={"ID":"974a7e82-05f0-4451-8848-de1d6d51916b","Type":"ContainerDied","Data":"89fb7c4f69ed31e24a1aa21344c6e73aa89d06142f306aee99234f6638514fd0"} Feb 17 15:30:04 crc kubenswrapper[4717]: I0217 15:30:04.240510 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89fb7c4f69ed31e24a1aa21344c6e73aa89d06142f306aee99234f6638514fd0" Feb 17 15:30:04 crc kubenswrapper[4717]: I0217 15:30:04.240521 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-bs6nj" Feb 17 15:30:04 crc kubenswrapper[4717]: I0217 15:30:04.741133 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6"] Feb 17 15:30:04 crc kubenswrapper[4717]: I0217 15:30:04.754369 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522325-45ck6"] Feb 17 15:30:05 crc kubenswrapper[4717]: I0217 15:30:05.866562 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a" path="/var/lib/kubelet/pods/d8a3fc6e-7e2e-4cc4-bbfe-c16e1299995a/volumes" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.799650 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-phq8j"] Feb 17 15:30:20 crc kubenswrapper[4717]: E0217 15:30:20.802503 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974a7e82-05f0-4451-8848-de1d6d51916b" containerName="collect-profiles" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.802659 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="974a7e82-05f0-4451-8848-de1d6d51916b" containerName="collect-profiles" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.803174 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="974a7e82-05f0-4451-8848-de1d6d51916b" containerName="collect-profiles" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.806786 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.812606 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phq8j"] Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.889866 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssrkl\" (UniqueName: \"kubernetes.io/projected/d783fc4f-ae86-4b01-8cdc-b7e40253f429-kube-api-access-ssrkl\") pod \"certified-operators-phq8j\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.889915 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-utilities\") pod \"certified-operators-phq8j\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.889956 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-catalog-content\") pod \"certified-operators-phq8j\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.992342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssrkl\" (UniqueName: \"kubernetes.io/projected/d783fc4f-ae86-4b01-8cdc-b7e40253f429-kube-api-access-ssrkl\") pod \"certified-operators-phq8j\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.992397 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-utilities\") pod \"certified-operators-phq8j\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.992447 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-catalog-content\") pod \"certified-operators-phq8j\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.995200 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-catalog-content\") pod \"certified-operators-phq8j\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:20 crc kubenswrapper[4717]: I0217 15:30:20.995210 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-utilities\") pod \"certified-operators-phq8j\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:21 crc kubenswrapper[4717]: I0217 15:30:21.019057 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssrkl\" (UniqueName: \"kubernetes.io/projected/d783fc4f-ae86-4b01-8cdc-b7e40253f429-kube-api-access-ssrkl\") pod \"certified-operators-phq8j\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:21 crc kubenswrapper[4717]: I0217 15:30:21.141878 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:21 crc kubenswrapper[4717]: I0217 15:30:21.649821 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phq8j"] Feb 17 15:30:22 crc kubenswrapper[4717]: I0217 15:30:22.464147 4717 generic.go:334] "Generic (PLEG): container finished" podID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" containerID="173cff959f98d46c826066f68723fa137378f31c8ad80de186333bd881fdd54f" exitCode=0 Feb 17 15:30:22 crc kubenswrapper[4717]: I0217 15:30:22.464369 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phq8j" event={"ID":"d783fc4f-ae86-4b01-8cdc-b7e40253f429","Type":"ContainerDied","Data":"173cff959f98d46c826066f68723fa137378f31c8ad80de186333bd881fdd54f"} Feb 17 15:30:22 crc kubenswrapper[4717]: I0217 15:30:22.464473 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phq8j" event={"ID":"d783fc4f-ae86-4b01-8cdc-b7e40253f429","Type":"ContainerStarted","Data":"fed995b71b23bc1b0134a1dece7fad4433d9e1615e9880be5dff2d2dd5b6e34b"} Feb 17 15:30:22 crc kubenswrapper[4717]: I0217 15:30:22.468384 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:30:23 crc kubenswrapper[4717]: I0217 15:30:23.480212 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phq8j" event={"ID":"d783fc4f-ae86-4b01-8cdc-b7e40253f429","Type":"ContainerStarted","Data":"dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529"} Feb 17 15:30:24 crc kubenswrapper[4717]: I0217 15:30:24.494344 4717 generic.go:334] "Generic (PLEG): container finished" podID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" containerID="dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529" exitCode=0 Feb 17 15:30:24 crc kubenswrapper[4717]: I0217 15:30:24.494591 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-phq8j" event={"ID":"d783fc4f-ae86-4b01-8cdc-b7e40253f429","Type":"ContainerDied","Data":"dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529"} Feb 17 15:30:25 crc kubenswrapper[4717]: I0217 15:30:25.509407 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phq8j" event={"ID":"d783fc4f-ae86-4b01-8cdc-b7e40253f429","Type":"ContainerStarted","Data":"ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1"} Feb 17 15:30:25 crc kubenswrapper[4717]: I0217 15:30:25.543006 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-phq8j" podStartSLOduration=3.104606705 podStartE2EDuration="5.54298906s" podCreationTimestamp="2026-02-17 15:30:20 +0000 UTC" firstStartedPulling="2026-02-17 15:30:22.467923376 +0000 UTC m=+2288.883763892" lastFinishedPulling="2026-02-17 15:30:24.906305751 +0000 UTC m=+2291.322146247" observedRunningTime="2026-02-17 15:30:25.537556276 +0000 UTC m=+2291.953396762" watchObservedRunningTime="2026-02-17 15:30:25.54298906 +0000 UTC m=+2291.958829536" Feb 17 15:30:31 crc kubenswrapper[4717]: I0217 15:30:31.142631 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:31 crc kubenswrapper[4717]: I0217 15:30:31.144018 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:31 crc kubenswrapper[4717]: I0217 15:30:31.231520 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:31 crc kubenswrapper[4717]: I0217 15:30:31.658386 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:31 crc kubenswrapper[4717]: I0217 15:30:31.731049 
4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phq8j"] Feb 17 15:30:33 crc kubenswrapper[4717]: I0217 15:30:33.594729 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-phq8j" podUID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" containerName="registry-server" containerID="cri-o://ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1" gracePeriod=2 Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.046892 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.161953 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-utilities\") pod \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.162032 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssrkl\" (UniqueName: \"kubernetes.io/projected/d783fc4f-ae86-4b01-8cdc-b7e40253f429-kube-api-access-ssrkl\") pod \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.162128 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-catalog-content\") pod \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\" (UID: \"d783fc4f-ae86-4b01-8cdc-b7e40253f429\") " Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.163617 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-utilities" (OuterVolumeSpecName: "utilities") pod 
"d783fc4f-ae86-4b01-8cdc-b7e40253f429" (UID: "d783fc4f-ae86-4b01-8cdc-b7e40253f429"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.168312 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d783fc4f-ae86-4b01-8cdc-b7e40253f429-kube-api-access-ssrkl" (OuterVolumeSpecName: "kube-api-access-ssrkl") pod "d783fc4f-ae86-4b01-8cdc-b7e40253f429" (UID: "d783fc4f-ae86-4b01-8cdc-b7e40253f429"). InnerVolumeSpecName "kube-api-access-ssrkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.218885 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d783fc4f-ae86-4b01-8cdc-b7e40253f429" (UID: "d783fc4f-ae86-4b01-8cdc-b7e40253f429"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.263863 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.263902 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssrkl\" (UniqueName: \"kubernetes.io/projected/d783fc4f-ae86-4b01-8cdc-b7e40253f429-kube-api-access-ssrkl\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.263922 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d783fc4f-ae86-4b01-8cdc-b7e40253f429-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.608655 4717 generic.go:334] "Generic (PLEG): container finished" podID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" containerID="ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1" exitCode=0 Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.608734 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phq8j" event={"ID":"d783fc4f-ae86-4b01-8cdc-b7e40253f429","Type":"ContainerDied","Data":"ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1"} Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.608994 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phq8j" event={"ID":"d783fc4f-ae86-4b01-8cdc-b7e40253f429","Type":"ContainerDied","Data":"fed995b71b23bc1b0134a1dece7fad4433d9e1615e9880be5dff2d2dd5b6e34b"} Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.609029 4717 scope.go:117] "RemoveContainer" containerID="ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 
15:30:34.608759 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phq8j" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.645392 4717 scope.go:117] "RemoveContainer" containerID="dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.654689 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phq8j"] Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.663696 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-phq8j"] Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.688800 4717 scope.go:117] "RemoveContainer" containerID="173cff959f98d46c826066f68723fa137378f31c8ad80de186333bd881fdd54f" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.734793 4717 scope.go:117] "RemoveContainer" containerID="ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1" Feb 17 15:30:34 crc kubenswrapper[4717]: E0217 15:30:34.736008 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1\": container with ID starting with ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1 not found: ID does not exist" containerID="ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.736040 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1"} err="failed to get container status \"ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1\": rpc error: code = NotFound desc = could not find container \"ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1\": container with ID starting with 
ea13f713752990a63d61a4018972328ea18823811e4208fbb3c1029ba77a32d1 not found: ID does not exist" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.736060 4717 scope.go:117] "RemoveContainer" containerID="dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529" Feb 17 15:30:34 crc kubenswrapper[4717]: E0217 15:30:34.736332 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529\": container with ID starting with dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529 not found: ID does not exist" containerID="dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.736352 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529"} err="failed to get container status \"dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529\": rpc error: code = NotFound desc = could not find container \"dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529\": container with ID starting with dc07a8a7bca405b1bc82664e1edfbac630cb593dc7ef8c1f7f31cd1bd7771529 not found: ID does not exist" Feb 17 15:30:34 crc kubenswrapper[4717]: I0217 15:30:34.736364 4717 scope.go:117] "RemoveContainer" containerID="173cff959f98d46c826066f68723fa137378f31c8ad80de186333bd881fdd54f" Feb 17 15:30:34 crc kubenswrapper[4717]: E0217 15:30:34.736557 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173cff959f98d46c826066f68723fa137378f31c8ad80de186333bd881fdd54f\": container with ID starting with 173cff959f98d46c826066f68723fa137378f31c8ad80de186333bd881fdd54f not found: ID does not exist" containerID="173cff959f98d46c826066f68723fa137378f31c8ad80de186333bd881fdd54f" Feb 17 15:30:34 crc 
kubenswrapper[4717]: I0217 15:30:34.736578 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173cff959f98d46c826066f68723fa137378f31c8ad80de186333bd881fdd54f"} err="failed to get container status \"173cff959f98d46c826066f68723fa137378f31c8ad80de186333bd881fdd54f\": rpc error: code = NotFound desc = could not find container \"173cff959f98d46c826066f68723fa137378f31c8ad80de186333bd881fdd54f\": container with ID starting with 173cff959f98d46c826066f68723fa137378f31c8ad80de186333bd881fdd54f not found: ID does not exist" Feb 17 15:30:35 crc kubenswrapper[4717]: I0217 15:30:35.863985 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" path="/var/lib/kubelet/pods/d783fc4f-ae86-4b01-8cdc-b7e40253f429/volumes" Feb 17 15:30:51 crc kubenswrapper[4717]: I0217 15:30:51.796241 4717 generic.go:334] "Generic (PLEG): container finished" podID="fff26fba-baaf-4ed4-9c5b-b6dec300d19c" containerID="4d191b45697381bda5cfe783ddef5b77512237b20d9959a6b27316fade303e71" exitCode=0 Feb 17 15:30:51 crc kubenswrapper[4717]: I0217 15:30:51.796388 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" event={"ID":"fff26fba-baaf-4ed4-9c5b-b6dec300d19c","Type":"ContainerDied","Data":"4d191b45697381bda5cfe783ddef5b77512237b20d9959a6b27316fade303e71"} Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.330526 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.483385 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9f77\" (UniqueName: \"kubernetes.io/projected/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-kube-api-access-z9f77\") pod \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.483532 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-inventory\") pod \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.483558 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-combined-ca-bundle\") pod \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.483633 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-secret-0\") pod \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.483693 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-ssh-key-openstack-edpm-ipam\") pod \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\" (UID: \"fff26fba-baaf-4ed4-9c5b-b6dec300d19c\") " Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.491362 4717 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-kube-api-access-z9f77" (OuterVolumeSpecName: "kube-api-access-z9f77") pod "fff26fba-baaf-4ed4-9c5b-b6dec300d19c" (UID: "fff26fba-baaf-4ed4-9c5b-b6dec300d19c"). InnerVolumeSpecName "kube-api-access-z9f77". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.491360 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fff26fba-baaf-4ed4-9c5b-b6dec300d19c" (UID: "fff26fba-baaf-4ed4-9c5b-b6dec300d19c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.522268 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-inventory" (OuterVolumeSpecName: "inventory") pod "fff26fba-baaf-4ed4-9c5b-b6dec300d19c" (UID: "fff26fba-baaf-4ed4-9c5b-b6dec300d19c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.523188 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fff26fba-baaf-4ed4-9c5b-b6dec300d19c" (UID: "fff26fba-baaf-4ed4-9c5b-b6dec300d19c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.525996 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "fff26fba-baaf-4ed4-9c5b-b6dec300d19c" (UID: "fff26fba-baaf-4ed4-9c5b-b6dec300d19c"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.585651 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.585700 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9f77\" (UniqueName: \"kubernetes.io/projected/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-kube-api-access-z9f77\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.585713 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.585725 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.585737 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fff26fba-baaf-4ed4-9c5b-b6dec300d19c-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.825522 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" event={"ID":"fff26fba-baaf-4ed4-9c5b-b6dec300d19c","Type":"ContainerDied","Data":"48081227ed954d50e94726b5578b4bb0f9b25a3ef103b1df7389d1c63140b09a"} Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.826011 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48081227ed954d50e94726b5578b4bb0f9b25a3ef103b1df7389d1c63140b09a" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.825584 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ts54t" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.969937 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9"] Feb 17 15:30:53 crc kubenswrapper[4717]: E0217 15:30:53.970370 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" containerName="extract-utilities" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.970392 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" containerName="extract-utilities" Feb 17 15:30:53 crc kubenswrapper[4717]: E0217 15:30:53.970404 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" containerName="registry-server" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.970413 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" containerName="registry-server" Feb 17 15:30:53 crc kubenswrapper[4717]: E0217 15:30:53.970443 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff26fba-baaf-4ed4-9c5b-b6dec300d19c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.970453 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fff26fba-baaf-4ed4-9c5b-b6dec300d19c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 15:30:53 crc kubenswrapper[4717]: E0217 15:30:53.970484 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" containerName="extract-content" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.970492 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" containerName="extract-content" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.970696 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff26fba-baaf-4ed4-9c5b-b6dec300d19c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.970712 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d783fc4f-ae86-4b01-8cdc-b7e40253f429" containerName="registry-server" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.971418 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.974074 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.974516 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.975104 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.975173 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.975245 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.975602 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.975773 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 17 15:30:53 crc kubenswrapper[4717]: I0217 15:30:53.985673 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9"] Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.095741 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 
15:30:54.095803 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/377efa15-97db-4618-85fd-1185cefde9a7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.095989 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.096049 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxx5s\" (UniqueName: \"kubernetes.io/projected/377efa15-97db-4618-85fd-1185cefde9a7-kube-api-access-dxx5s\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.096103 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.096180 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.096203 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.096461 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.096519 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.096592 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-3\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.096637 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.198206 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/377efa15-97db-4618-85fd-1185cefde9a7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.198538 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.198668 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxx5s\" (UniqueName: \"kubernetes.io/projected/377efa15-97db-4618-85fd-1185cefde9a7-kube-api-access-dxx5s\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.198780 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.198900 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.199009 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.199251 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.199356 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-2\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.199437 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/377efa15-97db-4618-85fd-1185cefde9a7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.199549 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.199657 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.199791 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.202593 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.202994 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.203423 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.203822 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.203881 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: 
\"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.204304 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.204741 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.205526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.206445 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.225291 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxx5s\" (UniqueName: 
\"kubernetes.io/projected/377efa15-97db-4618-85fd-1185cefde9a7-kube-api-access-dxx5s\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sftg9\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.291552 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:30:54 crc kubenswrapper[4717]: I0217 15:30:54.913165 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9"] Feb 17 15:30:55 crc kubenswrapper[4717]: I0217 15:30:55.862166 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" event={"ID":"377efa15-97db-4618-85fd-1185cefde9a7","Type":"ContainerStarted","Data":"22b2c1cdd0efbdb43ca0bf2501de290d9cf7d6c4896037a8aaa9401a16cf24c1"} Feb 17 15:30:55 crc kubenswrapper[4717]: I0217 15:30:55.862836 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" event={"ID":"377efa15-97db-4618-85fd-1185cefde9a7","Type":"ContainerStarted","Data":"fa2ed2acd9ae957628f360379390088d02e2c7315fe40bce6d71113dabbcbe68"} Feb 17 15:30:55 crc kubenswrapper[4717]: I0217 15:30:55.895423 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" podStartSLOduration=2.338470613 podStartE2EDuration="2.895403317s" podCreationTimestamp="2026-02-17 15:30:53 +0000 UTC" firstStartedPulling="2026-02-17 15:30:54.908497817 +0000 UTC m=+2321.324338293" lastFinishedPulling="2026-02-17 15:30:55.465430511 +0000 UTC m=+2321.881270997" observedRunningTime="2026-02-17 15:30:55.886258418 +0000 UTC m=+2322.302098894" watchObservedRunningTime="2026-02-17 15:30:55.895403317 +0000 UTC m=+2322.311243793" Feb 17 15:31:01 crc kubenswrapper[4717]: I0217 
15:31:01.799768 4717 scope.go:117] "RemoveContainer" containerID="984cdc1bdeead0a7608d6df5661d47f99ce597623911b388a43dea3120c99a44" Feb 17 15:31:20 crc kubenswrapper[4717]: I0217 15:31:20.808394 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:31:20 crc kubenswrapper[4717]: I0217 15:31:20.809112 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:31:50 crc kubenswrapper[4717]: I0217 15:31:50.808350 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:31:50 crc kubenswrapper[4717]: I0217 15:31:50.808878 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:32:20 crc kubenswrapper[4717]: I0217 15:32:20.808890 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:32:20 crc 
kubenswrapper[4717]: I0217 15:32:20.809698 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:32:20 crc kubenswrapper[4717]: I0217 15:32:20.809765 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:32:20 crc kubenswrapper[4717]: I0217 15:32:20.810841 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:32:20 crc kubenswrapper[4717]: I0217 15:32:20.810950 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" gracePeriod=600 Feb 17 15:32:20 crc kubenswrapper[4717]: E0217 15:32:20.955878 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:32:21 crc kubenswrapper[4717]: I0217 15:32:21.746684 4717 generic.go:334] "Generic 
(PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" exitCode=0 Feb 17 15:32:21 crc kubenswrapper[4717]: I0217 15:32:21.746751 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071"} Feb 17 15:32:21 crc kubenswrapper[4717]: I0217 15:32:21.746809 4717 scope.go:117] "RemoveContainer" containerID="f2ddaca91157a7f3cebc0d5a4db9348eb8786a6e148854bda086735851b9900e" Feb 17 15:32:21 crc kubenswrapper[4717]: I0217 15:32:21.747593 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:32:21 crc kubenswrapper[4717]: E0217 15:32:21.747947 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:32:36 crc kubenswrapper[4717]: I0217 15:32:36.847169 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:32:36 crc kubenswrapper[4717]: E0217 15:32:36.848215 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" 
podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:32:49 crc kubenswrapper[4717]: I0217 15:32:49.848142 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:32:49 crc kubenswrapper[4717]: E0217 15:32:49.850914 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:33:01 crc kubenswrapper[4717]: I0217 15:33:01.847322 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:33:01 crc kubenswrapper[4717]: E0217 15:33:01.848150 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:33:16 crc kubenswrapper[4717]: I0217 15:33:16.847659 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:33:16 crc kubenswrapper[4717]: E0217 15:33:16.848523 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:33:21 crc kubenswrapper[4717]: I0217 15:33:21.368408 4717 generic.go:334] "Generic (PLEG): container finished" podID="377efa15-97db-4618-85fd-1185cefde9a7" containerID="22b2c1cdd0efbdb43ca0bf2501de290d9cf7d6c4896037a8aaa9401a16cf24c1" exitCode=0 Feb 17 15:33:21 crc kubenswrapper[4717]: I0217 15:33:21.368519 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" event={"ID":"377efa15-97db-4618-85fd-1185cefde9a7","Type":"ContainerDied","Data":"22b2c1cdd0efbdb43ca0bf2501de290d9cf7d6c4896037a8aaa9401a16cf24c1"} Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.847532 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.973893 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-2\") pod \"377efa15-97db-4618-85fd-1185cefde9a7\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.974002 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-1\") pod \"377efa15-97db-4618-85fd-1185cefde9a7\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.974142 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/377efa15-97db-4618-85fd-1185cefde9a7-nova-extra-config-0\") pod \"377efa15-97db-4618-85fd-1185cefde9a7\" (UID: 
\"377efa15-97db-4618-85fd-1185cefde9a7\") " Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.974191 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-1\") pod \"377efa15-97db-4618-85fd-1185cefde9a7\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.974224 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-0\") pod \"377efa15-97db-4618-85fd-1185cefde9a7\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.974271 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-3\") pod \"377efa15-97db-4618-85fd-1185cefde9a7\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.974357 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-inventory\") pod \"377efa15-97db-4618-85fd-1185cefde9a7\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.974424 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxx5s\" (UniqueName: \"kubernetes.io/projected/377efa15-97db-4618-85fd-1185cefde9a7-kube-api-access-dxx5s\") pod \"377efa15-97db-4618-85fd-1185cefde9a7\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.974479 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-combined-ca-bundle\") pod \"377efa15-97db-4618-85fd-1185cefde9a7\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.974578 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-0\") pod \"377efa15-97db-4618-85fd-1185cefde9a7\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.974634 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-ssh-key-openstack-edpm-ipam\") pod \"377efa15-97db-4618-85fd-1185cefde9a7\" (UID: \"377efa15-97db-4618-85fd-1185cefde9a7\") " Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.983379 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "377efa15-97db-4618-85fd-1185cefde9a7" (UID: "377efa15-97db-4618-85fd-1185cefde9a7"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:33:22 crc kubenswrapper[4717]: I0217 15:33:22.983464 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377efa15-97db-4618-85fd-1185cefde9a7-kube-api-access-dxx5s" (OuterVolumeSpecName: "kube-api-access-dxx5s") pod "377efa15-97db-4618-85fd-1185cefde9a7" (UID: "377efa15-97db-4618-85fd-1185cefde9a7"). InnerVolumeSpecName "kube-api-access-dxx5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.014620 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "377efa15-97db-4618-85fd-1185cefde9a7" (UID: "377efa15-97db-4618-85fd-1185cefde9a7"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.014839 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "377efa15-97db-4618-85fd-1185cefde9a7" (UID: "377efa15-97db-4618-85fd-1185cefde9a7"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.021576 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-inventory" (OuterVolumeSpecName: "inventory") pod "377efa15-97db-4618-85fd-1185cefde9a7" (UID: "377efa15-97db-4618-85fd-1185cefde9a7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.025338 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "377efa15-97db-4618-85fd-1185cefde9a7" (UID: "377efa15-97db-4618-85fd-1185cefde9a7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.033356 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377efa15-97db-4618-85fd-1185cefde9a7-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "377efa15-97db-4618-85fd-1185cefde9a7" (UID: "377efa15-97db-4618-85fd-1185cefde9a7"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.037012 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "377efa15-97db-4618-85fd-1185cefde9a7" (UID: "377efa15-97db-4618-85fd-1185cefde9a7"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.040802 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "377efa15-97db-4618-85fd-1185cefde9a7" (UID: "377efa15-97db-4618-85fd-1185cefde9a7"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.044731 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "377efa15-97db-4618-85fd-1185cefde9a7" (UID: "377efa15-97db-4618-85fd-1185cefde9a7"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.056304 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "377efa15-97db-4618-85fd-1185cefde9a7" (UID: "377efa15-97db-4618-85fd-1185cefde9a7"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.077558 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.077600 4717 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.077615 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.077628 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.077672 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxx5s\" (UniqueName: \"kubernetes.io/projected/377efa15-97db-4618-85fd-1185cefde9a7-kube-api-access-dxx5s\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.077703 4717 
reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.077714 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.077726 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.077738 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.077749 4717 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/377efa15-97db-4618-85fd-1185cefde9a7-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.077762 4717 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/377efa15-97db-4618-85fd-1185cefde9a7-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.398416 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" event={"ID":"377efa15-97db-4618-85fd-1185cefde9a7","Type":"ContainerDied","Data":"fa2ed2acd9ae957628f360379390088d02e2c7315fe40bce6d71113dabbcbe68"} Feb 17 15:33:23 
crc kubenswrapper[4717]: I0217 15:33:23.398486 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa2ed2acd9ae957628f360379390088d02e2c7315fe40bce6d71113dabbcbe68" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.398631 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sftg9" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.580355 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv"] Feb 17 15:33:23 crc kubenswrapper[4717]: E0217 15:33:23.581069 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377efa15-97db-4618-85fd-1185cefde9a7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.581116 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="377efa15-97db-4618-85fd-1185cefde9a7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.581359 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="377efa15-97db-4618-85fd-1185cefde9a7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.582069 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.590165 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.590414 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.590589 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.590732 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.591128 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wrk28" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.602373 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv"] Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.690279 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.690358 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: 
\"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.690514 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.690606 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd5gf\" (UniqueName: \"kubernetes.io/projected/92941f8a-938e-41b2-ae92-ea490e7050d9-kube-api-access-fd5gf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.690694 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.690730 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.691052 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.792964 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.793058 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.793108 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.793136 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.793178 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd5gf\" (UniqueName: \"kubernetes.io/projected/92941f8a-938e-41b2-ae92-ea490e7050d9-kube-api-access-fd5gf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.793215 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.793242 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.798160 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.799156 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.801630 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.802461 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.802768 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.803914 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.811068 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd5gf\" (UniqueName: \"kubernetes.io/projected/92941f8a-938e-41b2-ae92-ea490e7050d9-kube-api-access-fd5gf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:23 crc kubenswrapper[4717]: I0217 15:33:23.909765 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:33:24 crc kubenswrapper[4717]: I0217 15:33:24.422154 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv"] Feb 17 15:33:25 crc kubenswrapper[4717]: I0217 15:33:25.416623 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" event={"ID":"92941f8a-938e-41b2-ae92-ea490e7050d9","Type":"ContainerStarted","Data":"3d66d618e3e9b8f0424b7f2cac1275146f1c31ca856ca61a2044277b8b0435e6"} Feb 17 15:33:25 crc kubenswrapper[4717]: I0217 15:33:25.416928 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" event={"ID":"92941f8a-938e-41b2-ae92-ea490e7050d9","Type":"ContainerStarted","Data":"b817fdbb9d80a7fb50b9b833c4b4474f72e6aa317f28b8698c2c268ddf208236"} Feb 17 15:33:25 crc kubenswrapper[4717]: I0217 15:33:25.445931 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" podStartSLOduration=2.016968059 podStartE2EDuration="2.445914069s" podCreationTimestamp="2026-02-17 15:33:23 +0000 UTC" firstStartedPulling="2026-02-17 15:33:24.429359595 +0000 UTC m=+2470.845200071" lastFinishedPulling="2026-02-17 15:33:24.858305565 +0000 UTC m=+2471.274146081" observedRunningTime="2026-02-17 15:33:25.434540207 +0000 UTC m=+2471.850380693" watchObservedRunningTime="2026-02-17 15:33:25.445914069 +0000 UTC m=+2471.861754545" Feb 17 15:33:29 crc kubenswrapper[4717]: I0217 15:33:29.847001 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:33:29 crc kubenswrapper[4717]: E0217 15:33:29.847866 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:33:42 crc kubenswrapper[4717]: I0217 15:33:42.847397 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:33:42 crc kubenswrapper[4717]: E0217 15:33:42.848407 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:33:55 crc kubenswrapper[4717]: I0217 15:33:55.852814 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:33:55 crc kubenswrapper[4717]: E0217 15:33:55.853617 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:34:08 crc kubenswrapper[4717]: I0217 15:34:08.847026 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:34:08 crc kubenswrapper[4717]: E0217 15:34:08.848144 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:34:19 crc kubenswrapper[4717]: I0217 15:34:19.847682 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:34:19 crc kubenswrapper[4717]: E0217 15:34:19.848359 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:34:31 crc kubenswrapper[4717]: I0217 15:34:31.846829 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:34:31 crc kubenswrapper[4717]: E0217 15:34:31.848020 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:34:42 crc kubenswrapper[4717]: I0217 15:34:42.846872 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:34:42 crc kubenswrapper[4717]: E0217 15:34:42.847805 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:34:56 crc kubenswrapper[4717]: I0217 15:34:56.846890 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:34:56 crc kubenswrapper[4717]: E0217 15:34:56.847809 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:35:07 crc kubenswrapper[4717]: I0217 15:35:07.847195 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:35:07 crc kubenswrapper[4717]: E0217 15:35:07.847849 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:35:18 crc kubenswrapper[4717]: I0217 15:35:18.847211 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:35:18 crc kubenswrapper[4717]: E0217 15:35:18.849751 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:35:29 crc kubenswrapper[4717]: I0217 15:35:29.847118 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:35:29 crc kubenswrapper[4717]: E0217 15:35:29.847935 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:35:43 crc kubenswrapper[4717]: I0217 15:35:43.847189 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:35:43 crc kubenswrapper[4717]: E0217 15:35:43.848288 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:35:47 crc kubenswrapper[4717]: I0217 15:35:47.877203 4717 generic.go:334] "Generic (PLEG): container finished" podID="92941f8a-938e-41b2-ae92-ea490e7050d9" containerID="3d66d618e3e9b8f0424b7f2cac1275146f1c31ca856ca61a2044277b8b0435e6" exitCode=0 Feb 17 15:35:47 crc kubenswrapper[4717]: 
I0217 15:35:47.877418 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" event={"ID":"92941f8a-938e-41b2-ae92-ea490e7050d9","Type":"ContainerDied","Data":"3d66d618e3e9b8f0424b7f2cac1275146f1c31ca856ca61a2044277b8b0435e6"} Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.133980 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nnb9k"] Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.137687 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.151051 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnb9k"] Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.238662 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-catalog-content\") pod \"community-operators-nnb9k\" (UID: \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") " pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.239179 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pgdp\" (UniqueName: \"kubernetes.io/projected/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-kube-api-access-7pgdp\") pod \"community-operators-nnb9k\" (UID: \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") " pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.239480 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-utilities\") pod \"community-operators-nnb9k\" (UID: 
\"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") " pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.349023 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pgdp\" (UniqueName: \"kubernetes.io/projected/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-kube-api-access-7pgdp\") pod \"community-operators-nnb9k\" (UID: \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") " pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.349093 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-utilities\") pod \"community-operators-nnb9k\" (UID: \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") " pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.349300 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-catalog-content\") pod \"community-operators-nnb9k\" (UID: \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") " pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.349605 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-utilities\") pod \"community-operators-nnb9k\" (UID: \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") " pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.349698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-catalog-content\") pod \"community-operators-nnb9k\" (UID: \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") 
" pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.373349 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pgdp\" (UniqueName: \"kubernetes.io/projected/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-kube-api-access-7pgdp\") pod \"community-operators-nnb9k\" (UID: \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") " pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.422228 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.484577 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.552680 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-telemetry-combined-ca-bundle\") pod \"92941f8a-938e-41b2-ae92-ea490e7050d9\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.552771 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-inventory\") pod \"92941f8a-938e-41b2-ae92-ea490e7050d9\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.553118 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd5gf\" (UniqueName: \"kubernetes.io/projected/92941f8a-938e-41b2-ae92-ea490e7050d9-kube-api-access-fd5gf\") pod \"92941f8a-938e-41b2-ae92-ea490e7050d9\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 
15:35:49.553150 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-0\") pod \"92941f8a-938e-41b2-ae92-ea490e7050d9\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.553471 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-1\") pod \"92941f8a-938e-41b2-ae92-ea490e7050d9\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.553521 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ssh-key-openstack-edpm-ipam\") pod \"92941f8a-938e-41b2-ae92-ea490e7050d9\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.553716 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-2\") pod \"92941f8a-938e-41b2-ae92-ea490e7050d9\" (UID: \"92941f8a-938e-41b2-ae92-ea490e7050d9\") " Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.559219 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "92941f8a-938e-41b2-ae92-ea490e7050d9" (UID: "92941f8a-938e-41b2-ae92-ea490e7050d9"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.561353 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92941f8a-938e-41b2-ae92-ea490e7050d9-kube-api-access-fd5gf" (OuterVolumeSpecName: "kube-api-access-fd5gf") pod "92941f8a-938e-41b2-ae92-ea490e7050d9" (UID: "92941f8a-938e-41b2-ae92-ea490e7050d9"). InnerVolumeSpecName "kube-api-access-fd5gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.596087 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "92941f8a-938e-41b2-ae92-ea490e7050d9" (UID: "92941f8a-938e-41b2-ae92-ea490e7050d9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.596239 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "92941f8a-938e-41b2-ae92-ea490e7050d9" (UID: "92941f8a-938e-41b2-ae92-ea490e7050d9"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.601645 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "92941f8a-938e-41b2-ae92-ea490e7050d9" (UID: "92941f8a-938e-41b2-ae92-ea490e7050d9"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.607297 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-inventory" (OuterVolumeSpecName: "inventory") pod "92941f8a-938e-41b2-ae92-ea490e7050d9" (UID: "92941f8a-938e-41b2-ae92-ea490e7050d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.614046 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "92941f8a-938e-41b2-ae92-ea490e7050d9" (UID: "92941f8a-938e-41b2-ae92-ea490e7050d9"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.657406 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.657464 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.657477 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.657489 4717 reconciler_common.go:293] "Volume detached for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.657505 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.657538 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd5gf\" (UniqueName: \"kubernetes.io/projected/92941f8a-938e-41b2-ae92-ea490e7050d9-kube-api-access-fd5gf\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.657549 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/92941f8a-938e-41b2-ae92-ea490e7050d9-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.902234 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" event={"ID":"92941f8a-938e-41b2-ae92-ea490e7050d9","Type":"ContainerDied","Data":"b817fdbb9d80a7fb50b9b833c4b4474f72e6aa317f28b8698c2c268ddf208236"} Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.902282 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b817fdbb9d80a7fb50b9b833c4b4474f72e6aa317f28b8698c2c268ddf208236" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.902292 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv" Feb 17 15:35:49 crc kubenswrapper[4717]: I0217 15:35:49.935520 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnb9k"] Feb 17 15:35:49 crc kubenswrapper[4717]: W0217 15:35:49.943421 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4865bbb_ec84_42e8_a688_8c0fb8eb6b66.slice/crio-f590f904bd37cb429fcefb69817445a084fb082c50180f5059c2c4afb1644d40 WatchSource:0}: Error finding container f590f904bd37cb429fcefb69817445a084fb082c50180f5059c2c4afb1644d40: Status 404 returned error can't find the container with id f590f904bd37cb429fcefb69817445a084fb082c50180f5059c2c4afb1644d40 Feb 17 15:35:50 crc kubenswrapper[4717]: I0217 15:35:50.911929 4717 generic.go:334] "Generic (PLEG): container finished" podID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" containerID="9234b8941727c263af9e7294eefdab306d024335e41ca20ce731280645566a18" exitCode=0 Feb 17 15:35:50 crc kubenswrapper[4717]: I0217 15:35:50.912001 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnb9k" event={"ID":"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66","Type":"ContainerDied","Data":"9234b8941727c263af9e7294eefdab306d024335e41ca20ce731280645566a18"} Feb 17 15:35:50 crc kubenswrapper[4717]: I0217 15:35:50.912267 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnb9k" event={"ID":"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66","Type":"ContainerStarted","Data":"f590f904bd37cb429fcefb69817445a084fb082c50180f5059c2c4afb1644d40"} Feb 17 15:35:50 crc kubenswrapper[4717]: I0217 15:35:50.915754 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:35:51 crc kubenswrapper[4717]: I0217 15:35:51.929062 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-nnb9k" event={"ID":"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66","Type":"ContainerStarted","Data":"52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015"} Feb 17 15:35:52 crc kubenswrapper[4717]: E0217 15:35:52.181898 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4865bbb_ec84_42e8_a688_8c0fb8eb6b66.slice/crio-52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015.scope\": RecentStats: unable to find data in memory cache]" Feb 17 15:35:52 crc kubenswrapper[4717]: I0217 15:35:52.944411 4717 generic.go:334] "Generic (PLEG): container finished" podID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" containerID="52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015" exitCode=0 Feb 17 15:35:52 crc kubenswrapper[4717]: I0217 15:35:52.945979 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnb9k" event={"ID":"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66","Type":"ContainerDied","Data":"52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015"} Feb 17 15:35:53 crc kubenswrapper[4717]: I0217 15:35:53.957138 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnb9k" event={"ID":"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66","Type":"ContainerStarted","Data":"dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4"} Feb 17 15:35:53 crc kubenswrapper[4717]: I0217 15:35:53.975822 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nnb9k" podStartSLOduration=2.494782674 podStartE2EDuration="4.975803868s" podCreationTimestamp="2026-02-17 15:35:49 +0000 UTC" firstStartedPulling="2026-02-17 15:35:50.915351261 +0000 UTC m=+2617.331191777" lastFinishedPulling="2026-02-17 15:35:53.396372495 +0000 UTC m=+2619.812212971" 
observedRunningTime="2026-02-17 15:35:53.9723395 +0000 UTC m=+2620.388179986" watchObservedRunningTime="2026-02-17 15:35:53.975803868 +0000 UTC m=+2620.391644354" Feb 17 15:35:56 crc kubenswrapper[4717]: I0217 15:35:56.847667 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:35:56 crc kubenswrapper[4717]: E0217 15:35:56.848323 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:35:59 crc kubenswrapper[4717]: I0217 15:35:59.485745 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:59 crc kubenswrapper[4717]: I0217 15:35:59.486080 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:35:59 crc kubenswrapper[4717]: I0217 15:35:59.539546 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:36:00 crc kubenswrapper[4717]: I0217 15:36:00.120449 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:36:00 crc kubenswrapper[4717]: I0217 15:36:00.199767 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnb9k"] Feb 17 15:36:02 crc kubenswrapper[4717]: I0217 15:36:02.048617 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nnb9k" 
podUID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" containerName="registry-server" containerID="cri-o://dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4" gracePeriod=2 Feb 17 15:36:02 crc kubenswrapper[4717]: I0217 15:36:02.537114 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:36:02 crc kubenswrapper[4717]: I0217 15:36:02.609645 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-utilities\") pod \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\" (UID: \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") " Feb 17 15:36:02 crc kubenswrapper[4717]: I0217 15:36:02.609772 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-catalog-content\") pod \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\" (UID: \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") " Feb 17 15:36:02 crc kubenswrapper[4717]: I0217 15:36:02.609800 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pgdp\" (UniqueName: \"kubernetes.io/projected/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-kube-api-access-7pgdp\") pod \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\" (UID: \"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66\") " Feb 17 15:36:02 crc kubenswrapper[4717]: I0217 15:36:02.610323 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-utilities" (OuterVolumeSpecName: "utilities") pod "d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" (UID: "d4865bbb-ec84-42e8-a688-8c0fb8eb6b66"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:02 crc kubenswrapper[4717]: I0217 15:36:02.617457 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-kube-api-access-7pgdp" (OuterVolumeSpecName: "kube-api-access-7pgdp") pod "d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" (UID: "d4865bbb-ec84-42e8-a688-8c0fb8eb6b66"). InnerVolumeSpecName "kube-api-access-7pgdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:36:02 crc kubenswrapper[4717]: I0217 15:36:02.671277 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" (UID: "d4865bbb-ec84-42e8-a688-8c0fb8eb6b66"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:02 crc kubenswrapper[4717]: I0217 15:36:02.712305 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:02 crc kubenswrapper[4717]: I0217 15:36:02.712350 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:02 crc kubenswrapper[4717]: I0217 15:36:02.712369 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pgdp\" (UniqueName: \"kubernetes.io/projected/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66-kube-api-access-7pgdp\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.065414 4717 generic.go:334] "Generic (PLEG): container finished" podID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" 
containerID="dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4" exitCode=0 Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.065489 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnb9k" event={"ID":"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66","Type":"ContainerDied","Data":"dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4"} Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.065542 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnb9k" Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.065573 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnb9k" event={"ID":"d4865bbb-ec84-42e8-a688-8c0fb8eb6b66","Type":"ContainerDied","Data":"f590f904bd37cb429fcefb69817445a084fb082c50180f5059c2c4afb1644d40"} Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.065609 4717 scope.go:117] "RemoveContainer" containerID="dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4" Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.091127 4717 scope.go:117] "RemoveContainer" containerID="52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015" Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.112440 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnb9k"] Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.121395 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nnb9k"] Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.131162 4717 scope.go:117] "RemoveContainer" containerID="9234b8941727c263af9e7294eefdab306d024335e41ca20ce731280645566a18" Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.184267 4717 scope.go:117] "RemoveContainer" containerID="dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4" Feb 17 
15:36:03 crc kubenswrapper[4717]: E0217 15:36:03.184600 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4\": container with ID starting with dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4 not found: ID does not exist" containerID="dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4" Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.184630 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4"} err="failed to get container status \"dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4\": rpc error: code = NotFound desc = could not find container \"dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4\": container with ID starting with dbb187ca8f04658644c338d27c09c5bec55ef6b9030b898e4908e2c6baf1bab4 not found: ID does not exist" Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.184652 4717 scope.go:117] "RemoveContainer" containerID="52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015" Feb 17 15:36:03 crc kubenswrapper[4717]: E0217 15:36:03.184841 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015\": container with ID starting with 52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015 not found: ID does not exist" containerID="52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015" Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.184867 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015"} err="failed to get container status 
\"52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015\": rpc error: code = NotFound desc = could not find container \"52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015\": container with ID starting with 52a5012ffa1d083fdf2c25711bf12861d7aef7ff50a72a2f3fb835af6babf015 not found: ID does not exist" Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.184884 4717 scope.go:117] "RemoveContainer" containerID="9234b8941727c263af9e7294eefdab306d024335e41ca20ce731280645566a18" Feb 17 15:36:03 crc kubenswrapper[4717]: E0217 15:36:03.185063 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9234b8941727c263af9e7294eefdab306d024335e41ca20ce731280645566a18\": container with ID starting with 9234b8941727c263af9e7294eefdab306d024335e41ca20ce731280645566a18 not found: ID does not exist" containerID="9234b8941727c263af9e7294eefdab306d024335e41ca20ce731280645566a18" Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.185108 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9234b8941727c263af9e7294eefdab306d024335e41ca20ce731280645566a18"} err="failed to get container status \"9234b8941727c263af9e7294eefdab306d024335e41ca20ce731280645566a18\": rpc error: code = NotFound desc = could not find container \"9234b8941727c263af9e7294eefdab306d024335e41ca20ce731280645566a18\": container with ID starting with 9234b8941727c263af9e7294eefdab306d024335e41ca20ce731280645566a18 not found: ID does not exist" Feb 17 15:36:03 crc kubenswrapper[4717]: I0217 15:36:03.862460 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" path="/var/lib/kubelet/pods/d4865bbb-ec84-42e8-a688-8c0fb8eb6b66/volumes" Feb 17 15:36:07 crc kubenswrapper[4717]: I0217 15:36:07.846938 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 
15:36:07 crc kubenswrapper[4717]: E0217 15:36:07.848051 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:36:19 crc kubenswrapper[4717]: I0217 15:36:19.847058 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:36:19 crc kubenswrapper[4717]: E0217 15:36:19.848265 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:36:32 crc kubenswrapper[4717]: I0217 15:36:32.847838 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:36:32 crc kubenswrapper[4717]: E0217 15:36:32.848927 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.450353 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 17 15:36:41 
crc kubenswrapper[4717]: E0217 15:36:41.451407 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92941f8a-938e-41b2-ae92-ea490e7050d9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.451426 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="92941f8a-938e-41b2-ae92-ea490e7050d9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 17 15:36:41 crc kubenswrapper[4717]: E0217 15:36:41.451447 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" containerName="extract-content" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.451455 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" containerName="extract-content" Feb 17 15:36:41 crc kubenswrapper[4717]: E0217 15:36:41.451488 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" containerName="registry-server" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.451495 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" containerName="registry-server" Feb 17 15:36:41 crc kubenswrapper[4717]: E0217 15:36:41.451516 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" containerName="extract-utilities" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.451524 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" containerName="extract-utilities" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.451745 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4865bbb-ec84-42e8-a688-8c0fb8eb6b66" containerName="registry-server" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.451765 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="92941f8a-938e-41b2-ae92-ea490e7050d9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.452570 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.457287 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.457564 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.457663 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-2lnjw" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.457729 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.474318 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.544434 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.544505 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-config-data\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.544580 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.544855 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz2p7\" (UniqueName: \"kubernetes.io/projected/6bc8b07d-6032-43fa-821d-fa1685427d56-kube-api-access-jz2p7\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.544940 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.544987 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.545240 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 
15:36:41.546254 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.546296 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.647807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.647941 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz2p7\" (UniqueName: \"kubernetes.io/projected/6bc8b07d-6032-43fa-821d-fa1685427d56-kube-api-access-jz2p7\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.647998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.648051 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.648222 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.648262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.648298 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.648408 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.648448 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-config-data\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.648465 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.649304 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.649608 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.650416 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.652595 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-config-data\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.658025 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.658054 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.658785 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.671587 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz2p7\" (UniqueName: \"kubernetes.io/projected/6bc8b07d-6032-43fa-821d-fa1685427d56-kube-api-access-jz2p7\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.700635 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " 
pod="openstack/tempest-tests-tempest" Feb 17 15:36:41 crc kubenswrapper[4717]: I0217 15:36:41.783069 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 15:36:42 crc kubenswrapper[4717]: I0217 15:36:42.089003 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 17 15:36:42 crc kubenswrapper[4717]: I0217 15:36:42.530685 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6bc8b07d-6032-43fa-821d-fa1685427d56","Type":"ContainerStarted","Data":"2aa15342fc90321cf3f8ad127cceeddfd36580b5451edef92f44c1355db5efa9"} Feb 17 15:36:44 crc kubenswrapper[4717]: I0217 15:36:44.847449 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:36:44 crc kubenswrapper[4717]: E0217 15:36:44.848336 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:36:59 crc kubenswrapper[4717]: I0217 15:36:59.847236 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:36:59 crc kubenswrapper[4717]: E0217 15:36:59.847981 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" 
podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:37:10 crc kubenswrapper[4717]: I0217 15:37:10.846431 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:37:10 crc kubenswrapper[4717]: E0217 15:37:10.847200 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:37:12 crc kubenswrapper[4717]: E0217 15:37:12.428903 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 17 15:37:12 crc kubenswrapper[4717]: E0217 15:37:12.429593 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jz2p7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(6bc8b07d-6032-43fa-821d-fa1685427d56): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 15:37:12 crc kubenswrapper[4717]: E0217 15:37:12.430904 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="6bc8b07d-6032-43fa-821d-fa1685427d56" Feb 17 15:37:12 crc kubenswrapper[4717]: E0217 15:37:12.862829 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="6bc8b07d-6032-43fa-821d-fa1685427d56" Feb 17 15:37:24 crc 
kubenswrapper[4717]: I0217 15:37:24.280892 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 17 15:37:24 crc kubenswrapper[4717]: I0217 15:37:24.846531 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:37:25 crc kubenswrapper[4717]: I0217 15:37:25.997863 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6bc8b07d-6032-43fa-821d-fa1685427d56","Type":"ContainerStarted","Data":"c93e579fc219ac3b634cde83769127c7881ea521e0818922f7bea1f386da9148"} Feb 17 15:37:26 crc kubenswrapper[4717]: I0217 15:37:26.000520 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"f641f01d802d39de025b194b77701d5801ece74386c0c17964f7b20e78780fd4"} Feb 17 15:37:26 crc kubenswrapper[4717]: I0217 15:37:26.039358 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.848849204 podStartE2EDuration="46.039340178s" podCreationTimestamp="2026-02-17 15:36:40 +0000 UTC" firstStartedPulling="2026-02-17 15:36:42.086834566 +0000 UTC m=+2668.502675062" lastFinishedPulling="2026-02-17 15:37:24.27732552 +0000 UTC m=+2710.693166036" observedRunningTime="2026-02-17 15:37:26.023552983 +0000 UTC m=+2712.439393479" watchObservedRunningTime="2026-02-17 15:37:26.039340178 +0000 UTC m=+2712.455180654" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.183452 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k6m5t"] Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.186383 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.200828 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6m5t"] Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.306043 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckhvw\" (UniqueName: \"kubernetes.io/projected/a4ca686c-63fc-4c46-8442-32cbdcf3732f-kube-api-access-ckhvw\") pod \"redhat-operators-k6m5t\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.306156 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-catalog-content\") pod \"redhat-operators-k6m5t\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.306227 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-utilities\") pod \"redhat-operators-k6m5t\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.408647 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-catalog-content\") pod \"redhat-operators-k6m5t\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.408967 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-utilities\") pod \"redhat-operators-k6m5t\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.409180 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-catalog-content\") pod \"redhat-operators-k6m5t\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.409471 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckhvw\" (UniqueName: \"kubernetes.io/projected/a4ca686c-63fc-4c46-8442-32cbdcf3732f-kube-api-access-ckhvw\") pod \"redhat-operators-k6m5t\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.409475 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-utilities\") pod \"redhat-operators-k6m5t\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.437705 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckhvw\" (UniqueName: \"kubernetes.io/projected/a4ca686c-63fc-4c46-8442-32cbdcf3732f-kube-api-access-ckhvw\") pod \"redhat-operators-k6m5t\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.512721 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:13 crc kubenswrapper[4717]: I0217 15:38:13.963135 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6m5t"] Feb 17 15:38:14 crc kubenswrapper[4717]: I0217 15:38:14.516837 4717 generic.go:334] "Generic (PLEG): container finished" podID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerID="10e894d467a628032e0f6e6be07950b8cff0eb70f21d383905455513b1142e1b" exitCode=0 Feb 17 15:38:14 crc kubenswrapper[4717]: I0217 15:38:14.516904 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6m5t" event={"ID":"a4ca686c-63fc-4c46-8442-32cbdcf3732f","Type":"ContainerDied","Data":"10e894d467a628032e0f6e6be07950b8cff0eb70f21d383905455513b1142e1b"} Feb 17 15:38:14 crc kubenswrapper[4717]: I0217 15:38:14.516930 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6m5t" event={"ID":"a4ca686c-63fc-4c46-8442-32cbdcf3732f","Type":"ContainerStarted","Data":"147c38f7490ebc9a0210bc84bdb18ce166c4299b9ae0dba5b7f2f22228f75716"} Feb 17 15:38:15 crc kubenswrapper[4717]: I0217 15:38:15.529484 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6m5t" event={"ID":"a4ca686c-63fc-4c46-8442-32cbdcf3732f","Type":"ContainerStarted","Data":"05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4"} Feb 17 15:38:16 crc kubenswrapper[4717]: I0217 15:38:16.541243 4717 generic.go:334] "Generic (PLEG): container finished" podID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerID="05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4" exitCode=0 Feb 17 15:38:16 crc kubenswrapper[4717]: I0217 15:38:16.541336 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6m5t" 
event={"ID":"a4ca686c-63fc-4c46-8442-32cbdcf3732f","Type":"ContainerDied","Data":"05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4"} Feb 17 15:38:17 crc kubenswrapper[4717]: I0217 15:38:17.562842 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6m5t" event={"ID":"a4ca686c-63fc-4c46-8442-32cbdcf3732f","Type":"ContainerStarted","Data":"745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460"} Feb 17 15:38:17 crc kubenswrapper[4717]: I0217 15:38:17.597332 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k6m5t" podStartSLOduration=2.079654548 podStartE2EDuration="4.597310303s" podCreationTimestamp="2026-02-17 15:38:13 +0000 UTC" firstStartedPulling="2026-02-17 15:38:14.518871017 +0000 UTC m=+2760.934711493" lastFinishedPulling="2026-02-17 15:38:17.036526732 +0000 UTC m=+2763.452367248" observedRunningTime="2026-02-17 15:38:17.596587683 +0000 UTC m=+2764.012428169" watchObservedRunningTime="2026-02-17 15:38:17.597310303 +0000 UTC m=+2764.013150869" Feb 17 15:38:23 crc kubenswrapper[4717]: I0217 15:38:23.513277 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:23 crc kubenswrapper[4717]: I0217 15:38:23.514804 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:24 crc kubenswrapper[4717]: I0217 15:38:24.559283 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k6m5t" podUID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerName="registry-server" probeResult="failure" output=< Feb 17 15:38:24 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 17 15:38:24 crc kubenswrapper[4717]: > Feb 17 15:38:33 crc kubenswrapper[4717]: I0217 15:38:33.580215 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:33 crc kubenswrapper[4717]: I0217 15:38:33.644689 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:33 crc kubenswrapper[4717]: I0217 15:38:33.827242 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6m5t"] Feb 17 15:38:34 crc kubenswrapper[4717]: I0217 15:38:34.741332 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k6m5t" podUID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerName="registry-server" containerID="cri-o://745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460" gracePeriod=2 Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.271884 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.363424 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-utilities\") pod \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.363502 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckhvw\" (UniqueName: \"kubernetes.io/projected/a4ca686c-63fc-4c46-8442-32cbdcf3732f-kube-api-access-ckhvw\") pod \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.363591 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-catalog-content\") pod 
\"a4ca686c-63fc-4c46-8442-32cbdcf3732f\" (UID: \"a4ca686c-63fc-4c46-8442-32cbdcf3732f\") " Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.364319 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-utilities" (OuterVolumeSpecName: "utilities") pod "a4ca686c-63fc-4c46-8442-32cbdcf3732f" (UID: "a4ca686c-63fc-4c46-8442-32cbdcf3732f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.369574 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ca686c-63fc-4c46-8442-32cbdcf3732f-kube-api-access-ckhvw" (OuterVolumeSpecName: "kube-api-access-ckhvw") pod "a4ca686c-63fc-4c46-8442-32cbdcf3732f" (UID: "a4ca686c-63fc-4c46-8442-32cbdcf3732f"). InnerVolumeSpecName "kube-api-access-ckhvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.466874 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.467026 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckhvw\" (UniqueName: \"kubernetes.io/projected/a4ca686c-63fc-4c46-8442-32cbdcf3732f-kube-api-access-ckhvw\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.475193 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4ca686c-63fc-4c46-8442-32cbdcf3732f" (UID: "a4ca686c-63fc-4c46-8442-32cbdcf3732f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.569586 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ca686c-63fc-4c46-8442-32cbdcf3732f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.754597 4717 generic.go:334] "Generic (PLEG): container finished" podID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerID="745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460" exitCode=0 Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.754635 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6m5t" event={"ID":"a4ca686c-63fc-4c46-8442-32cbdcf3732f","Type":"ContainerDied","Data":"745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460"} Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.754666 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6m5t" event={"ID":"a4ca686c-63fc-4c46-8442-32cbdcf3732f","Type":"ContainerDied","Data":"147c38f7490ebc9a0210bc84bdb18ce166c4299b9ae0dba5b7f2f22228f75716"} Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.754683 4717 scope.go:117] "RemoveContainer" containerID="745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.754681 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6m5t" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.796550 4717 scope.go:117] "RemoveContainer" containerID="05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.805111 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6m5t"] Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.812492 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k6m5t"] Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.833075 4717 scope.go:117] "RemoveContainer" containerID="10e894d467a628032e0f6e6be07950b8cff0eb70f21d383905455513b1142e1b" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.862548 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" path="/var/lib/kubelet/pods/a4ca686c-63fc-4c46-8442-32cbdcf3732f/volumes" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.867364 4717 scope.go:117] "RemoveContainer" containerID="745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460" Feb 17 15:38:35 crc kubenswrapper[4717]: E0217 15:38:35.867788 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460\": container with ID starting with 745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460 not found: ID does not exist" containerID="745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.867834 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460"} err="failed to get container status 
\"745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460\": rpc error: code = NotFound desc = could not find container \"745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460\": container with ID starting with 745ed7032bc8edbc166711681d68591fddb6ea4ae4aa49276eb6432280d84460 not found: ID does not exist" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.867859 4717 scope.go:117] "RemoveContainer" containerID="05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4" Feb 17 15:38:35 crc kubenswrapper[4717]: E0217 15:38:35.868162 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4\": container with ID starting with 05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4 not found: ID does not exist" containerID="05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.868189 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4"} err="failed to get container status \"05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4\": rpc error: code = NotFound desc = could not find container \"05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4\": container with ID starting with 05d5ba2f94109b7ff5ae09811d7e269c82838e765acd65d6cb6440080cd49ec4 not found: ID does not exist" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.868207 4717 scope.go:117] "RemoveContainer" containerID="10e894d467a628032e0f6e6be07950b8cff0eb70f21d383905455513b1142e1b" Feb 17 15:38:35 crc kubenswrapper[4717]: E0217 15:38:35.868505 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"10e894d467a628032e0f6e6be07950b8cff0eb70f21d383905455513b1142e1b\": container with ID starting with 10e894d467a628032e0f6e6be07950b8cff0eb70f21d383905455513b1142e1b not found: ID does not exist" containerID="10e894d467a628032e0f6e6be07950b8cff0eb70f21d383905455513b1142e1b" Feb 17 15:38:35 crc kubenswrapper[4717]: I0217 15:38:35.868555 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e894d467a628032e0f6e6be07950b8cff0eb70f21d383905455513b1142e1b"} err="failed to get container status \"10e894d467a628032e0f6e6be07950b8cff0eb70f21d383905455513b1142e1b\": rpc error: code = NotFound desc = could not find container \"10e894d467a628032e0f6e6be07950b8cff0eb70f21d383905455513b1142e1b\": container with ID starting with 10e894d467a628032e0f6e6be07950b8cff0eb70f21d383905455513b1142e1b not found: ID does not exist" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.174294 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rrfkd"] Feb 17 15:39:00 crc kubenswrapper[4717]: E0217 15:39:00.175849 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerName="extract-content" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.175890 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerName="extract-content" Feb 17 15:39:00 crc kubenswrapper[4717]: E0217 15:39:00.175989 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerName="extract-utilities" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.176018 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerName="extract-utilities" Feb 17 15:39:00 crc kubenswrapper[4717]: E0217 15:39:00.176072 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerName="registry-server" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.176133 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerName="registry-server" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.176615 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ca686c-63fc-4c46-8442-32cbdcf3732f" containerName="registry-server" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.180222 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.194410 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrfkd"] Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.330698 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl9q2\" (UniqueName: \"kubernetes.io/projected/d827aaff-71ad-4267-badc-cfd4e725225e-kube-api-access-xl9q2\") pod \"redhat-marketplace-rrfkd\" (UID: \"d827aaff-71ad-4267-badc-cfd4e725225e\") " pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.330878 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-catalog-content\") pod \"redhat-marketplace-rrfkd\" (UID: \"d827aaff-71ad-4267-badc-cfd4e725225e\") " pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.330929 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-utilities\") pod \"redhat-marketplace-rrfkd\" (UID: 
\"d827aaff-71ad-4267-badc-cfd4e725225e\") " pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.433291 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl9q2\" (UniqueName: \"kubernetes.io/projected/d827aaff-71ad-4267-badc-cfd4e725225e-kube-api-access-xl9q2\") pod \"redhat-marketplace-rrfkd\" (UID: \"d827aaff-71ad-4267-badc-cfd4e725225e\") " pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.433432 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-catalog-content\") pod \"redhat-marketplace-rrfkd\" (UID: \"d827aaff-71ad-4267-badc-cfd4e725225e\") " pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.433485 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-utilities\") pod \"redhat-marketplace-rrfkd\" (UID: \"d827aaff-71ad-4267-badc-cfd4e725225e\") " pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.433926 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-catalog-content\") pod \"redhat-marketplace-rrfkd\" (UID: \"d827aaff-71ad-4267-badc-cfd4e725225e\") " pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.434001 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-utilities\") pod \"redhat-marketplace-rrfkd\" (UID: \"d827aaff-71ad-4267-badc-cfd4e725225e\") " 
pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.469874 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl9q2\" (UniqueName: \"kubernetes.io/projected/d827aaff-71ad-4267-badc-cfd4e725225e-kube-api-access-xl9q2\") pod \"redhat-marketplace-rrfkd\" (UID: \"d827aaff-71ad-4267-badc-cfd4e725225e\") " pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:00 crc kubenswrapper[4717]: I0217 15:39:00.526593 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:01 crc kubenswrapper[4717]: I0217 15:39:01.008270 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrfkd"] Feb 17 15:39:01 crc kubenswrapper[4717]: W0217 15:39:01.014125 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd827aaff_71ad_4267_badc_cfd4e725225e.slice/crio-5518ea1adb8049fdc78f8681f7337776b272d5a93070b3ca9a5a7ab237759b18 WatchSource:0}: Error finding container 5518ea1adb8049fdc78f8681f7337776b272d5a93070b3ca9a5a7ab237759b18: Status 404 returned error can't find the container with id 5518ea1adb8049fdc78f8681f7337776b272d5a93070b3ca9a5a7ab237759b18 Feb 17 15:39:02 crc kubenswrapper[4717]: I0217 15:39:02.009277 4717 generic.go:334] "Generic (PLEG): container finished" podID="d827aaff-71ad-4267-badc-cfd4e725225e" containerID="460e7acd66f0961967f13ff02cc1fd260aea18a45179e40f763a37962ca116ea" exitCode=0 Feb 17 15:39:02 crc kubenswrapper[4717]: I0217 15:39:02.010473 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrfkd" event={"ID":"d827aaff-71ad-4267-badc-cfd4e725225e","Type":"ContainerDied","Data":"460e7acd66f0961967f13ff02cc1fd260aea18a45179e40f763a37962ca116ea"} Feb 17 15:39:02 crc kubenswrapper[4717]: I0217 15:39:02.010657 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrfkd" event={"ID":"d827aaff-71ad-4267-badc-cfd4e725225e","Type":"ContainerStarted","Data":"5518ea1adb8049fdc78f8681f7337776b272d5a93070b3ca9a5a7ab237759b18"} Feb 17 15:39:03 crc kubenswrapper[4717]: I0217 15:39:03.020329 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrfkd" event={"ID":"d827aaff-71ad-4267-badc-cfd4e725225e","Type":"ContainerStarted","Data":"5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9"} Feb 17 15:39:04 crc kubenswrapper[4717]: I0217 15:39:04.032338 4717 generic.go:334] "Generic (PLEG): container finished" podID="d827aaff-71ad-4267-badc-cfd4e725225e" containerID="5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9" exitCode=0 Feb 17 15:39:04 crc kubenswrapper[4717]: I0217 15:39:04.032408 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrfkd" event={"ID":"d827aaff-71ad-4267-badc-cfd4e725225e","Type":"ContainerDied","Data":"5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9"} Feb 17 15:39:05 crc kubenswrapper[4717]: I0217 15:39:05.043811 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrfkd" event={"ID":"d827aaff-71ad-4267-badc-cfd4e725225e","Type":"ContainerStarted","Data":"17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd"} Feb 17 15:39:05 crc kubenswrapper[4717]: I0217 15:39:05.076294 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rrfkd" podStartSLOduration=2.659520379 podStartE2EDuration="5.076276399s" podCreationTimestamp="2026-02-17 15:39:00 +0000 UTC" firstStartedPulling="2026-02-17 15:39:02.012798006 +0000 UTC m=+2808.428638482" lastFinishedPulling="2026-02-17 15:39:04.429554016 +0000 UTC m=+2810.845394502" observedRunningTime="2026-02-17 15:39:05.073918433 
+0000 UTC m=+2811.489758919" watchObservedRunningTime="2026-02-17 15:39:05.076276399 +0000 UTC m=+2811.492116875" Feb 17 15:39:10 crc kubenswrapper[4717]: I0217 15:39:10.526758 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:10 crc kubenswrapper[4717]: I0217 15:39:10.527312 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:10 crc kubenswrapper[4717]: I0217 15:39:10.576410 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:11 crc kubenswrapper[4717]: I0217 15:39:11.168249 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.001407 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrfkd"] Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.119895 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rrfkd" podUID="d827aaff-71ad-4267-badc-cfd4e725225e" containerName="registry-server" containerID="cri-o://17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd" gracePeriod=2 Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.615356 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.808808 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-utilities\") pod \"d827aaff-71ad-4267-badc-cfd4e725225e\" (UID: \"d827aaff-71ad-4267-badc-cfd4e725225e\") " Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.808916 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-catalog-content\") pod \"d827aaff-71ad-4267-badc-cfd4e725225e\" (UID: \"d827aaff-71ad-4267-badc-cfd4e725225e\") " Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.809150 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl9q2\" (UniqueName: \"kubernetes.io/projected/d827aaff-71ad-4267-badc-cfd4e725225e-kube-api-access-xl9q2\") pod \"d827aaff-71ad-4267-badc-cfd4e725225e\" (UID: \"d827aaff-71ad-4267-badc-cfd4e725225e\") " Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.813613 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-utilities" (OuterVolumeSpecName: "utilities") pod "d827aaff-71ad-4267-badc-cfd4e725225e" (UID: "d827aaff-71ad-4267-badc-cfd4e725225e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.820381 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d827aaff-71ad-4267-badc-cfd4e725225e-kube-api-access-xl9q2" (OuterVolumeSpecName: "kube-api-access-xl9q2") pod "d827aaff-71ad-4267-badc-cfd4e725225e" (UID: "d827aaff-71ad-4267-badc-cfd4e725225e"). InnerVolumeSpecName "kube-api-access-xl9q2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.846697 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d827aaff-71ad-4267-badc-cfd4e725225e" (UID: "d827aaff-71ad-4267-badc-cfd4e725225e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.912245 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.912275 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d827aaff-71ad-4267-badc-cfd4e725225e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:39:13 crc kubenswrapper[4717]: I0217 15:39:13.912287 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl9q2\" (UniqueName: \"kubernetes.io/projected/d827aaff-71ad-4267-badc-cfd4e725225e-kube-api-access-xl9q2\") on node \"crc\" DevicePath \"\"" Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.137859 4717 generic.go:334] "Generic (PLEG): container finished" podID="d827aaff-71ad-4267-badc-cfd4e725225e" containerID="17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd" exitCode=0 Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.137942 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrfkd" event={"ID":"d827aaff-71ad-4267-badc-cfd4e725225e","Type":"ContainerDied","Data":"17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd"} Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.137980 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrfkd" Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.138004 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrfkd" event={"ID":"d827aaff-71ad-4267-badc-cfd4e725225e","Type":"ContainerDied","Data":"5518ea1adb8049fdc78f8681f7337776b272d5a93070b3ca9a5a7ab237759b18"} Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.138039 4717 scope.go:117] "RemoveContainer" containerID="17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd" Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.174270 4717 scope.go:117] "RemoveContainer" containerID="5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9" Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.187579 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrfkd"] Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.201289 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrfkd"] Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.213846 4717 scope.go:117] "RemoveContainer" containerID="460e7acd66f0961967f13ff02cc1fd260aea18a45179e40f763a37962ca116ea" Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.268379 4717 scope.go:117] "RemoveContainer" containerID="17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd" Feb 17 15:39:14 crc kubenswrapper[4717]: E0217 15:39:14.268834 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd\": container with ID starting with 17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd not found: ID does not exist" containerID="17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd" Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.268866 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd"} err="failed to get container status \"17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd\": rpc error: code = NotFound desc = could not find container \"17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd\": container with ID starting with 17a1ed5ce79eb84853e1e44a0e1732b3af629c79a04ac90be33b1aed85ba1fbd not found: ID does not exist" Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.268887 4717 scope.go:117] "RemoveContainer" containerID="5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9" Feb 17 15:39:14 crc kubenswrapper[4717]: E0217 15:39:14.269431 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9\": container with ID starting with 5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9 not found: ID does not exist" containerID="5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9" Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.269500 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9"} err="failed to get container status \"5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9\": rpc error: code = NotFound desc = could not find container \"5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9\": container with ID starting with 5d62d7d14f510cb64a44df294d682ed28e01dd046a8e9e53bda32e8173c105f9 not found: ID does not exist" Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.269544 4717 scope.go:117] "RemoveContainer" containerID="460e7acd66f0961967f13ff02cc1fd260aea18a45179e40f763a37962ca116ea" Feb 17 15:39:14 crc kubenswrapper[4717]: E0217 
15:39:14.269917 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"460e7acd66f0961967f13ff02cc1fd260aea18a45179e40f763a37962ca116ea\": container with ID starting with 460e7acd66f0961967f13ff02cc1fd260aea18a45179e40f763a37962ca116ea not found: ID does not exist" containerID="460e7acd66f0961967f13ff02cc1fd260aea18a45179e40f763a37962ca116ea" Feb 17 15:39:14 crc kubenswrapper[4717]: I0217 15:39:14.269966 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"460e7acd66f0961967f13ff02cc1fd260aea18a45179e40f763a37962ca116ea"} err="failed to get container status \"460e7acd66f0961967f13ff02cc1fd260aea18a45179e40f763a37962ca116ea\": rpc error: code = NotFound desc = could not find container \"460e7acd66f0961967f13ff02cc1fd260aea18a45179e40f763a37962ca116ea\": container with ID starting with 460e7acd66f0961967f13ff02cc1fd260aea18a45179e40f763a37962ca116ea not found: ID does not exist" Feb 17 15:39:15 crc kubenswrapper[4717]: I0217 15:39:15.866427 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d827aaff-71ad-4267-badc-cfd4e725225e" path="/var/lib/kubelet/pods/d827aaff-71ad-4267-badc-cfd4e725225e/volumes" Feb 17 15:39:50 crc kubenswrapper[4717]: I0217 15:39:50.808424 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:39:50 crc kubenswrapper[4717]: I0217 15:39:50.808899 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 17 15:40:20 crc kubenswrapper[4717]: I0217 15:40:20.807914 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:40:20 crc kubenswrapper[4717]: I0217 15:40:20.808406 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:40:50 crc kubenswrapper[4717]: I0217 15:40:50.808976 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:40:50 crc kubenswrapper[4717]: I0217 15:40:50.810154 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:40:50 crc kubenswrapper[4717]: I0217 15:40:50.810243 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:40:50 crc kubenswrapper[4717]: I0217 15:40:50.811631 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f641f01d802d39de025b194b77701d5801ece74386c0c17964f7b20e78780fd4"} 
pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:40:50 crc kubenswrapper[4717]: I0217 15:40:50.811723 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://f641f01d802d39de025b194b77701d5801ece74386c0c17964f7b20e78780fd4" gracePeriod=600 Feb 17 15:40:51 crc kubenswrapper[4717]: I0217 15:40:51.206470 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="f641f01d802d39de025b194b77701d5801ece74386c0c17964f7b20e78780fd4" exitCode=0 Feb 17 15:40:51 crc kubenswrapper[4717]: I0217 15:40:51.206553 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"f641f01d802d39de025b194b77701d5801ece74386c0c17964f7b20e78780fd4"} Feb 17 15:40:51 crc kubenswrapper[4717]: I0217 15:40:51.206811 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd"} Feb 17 15:40:51 crc kubenswrapper[4717]: I0217 15:40:51.206833 4717 scope.go:117] "RemoveContainer" containerID="7980a3e5ea0a20a733fec739869def8bf7755e78c21adf14df5835c204a94071" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.388792 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nm4nn"] Feb 17 15:41:15 crc kubenswrapper[4717]: E0217 15:41:15.389732 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d827aaff-71ad-4267-badc-cfd4e725225e" containerName="extract-utilities" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.389744 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d827aaff-71ad-4267-badc-cfd4e725225e" containerName="extract-utilities" Feb 17 15:41:15 crc kubenswrapper[4717]: E0217 15:41:15.389761 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d827aaff-71ad-4267-badc-cfd4e725225e" containerName="extract-content" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.389768 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d827aaff-71ad-4267-badc-cfd4e725225e" containerName="extract-content" Feb 17 15:41:15 crc kubenswrapper[4717]: E0217 15:41:15.389781 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d827aaff-71ad-4267-badc-cfd4e725225e" containerName="registry-server" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.389787 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d827aaff-71ad-4267-badc-cfd4e725225e" containerName="registry-server" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.389984 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d827aaff-71ad-4267-badc-cfd4e725225e" containerName="registry-server" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.391733 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.408920 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nm4nn"] Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.490970 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-626z8\" (UniqueName: \"kubernetes.io/projected/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-kube-api-access-626z8\") pod \"certified-operators-nm4nn\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.491035 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-utilities\") pod \"certified-operators-nm4nn\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.491201 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-catalog-content\") pod \"certified-operators-nm4nn\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.593863 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-626z8\" (UniqueName: \"kubernetes.io/projected/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-kube-api-access-626z8\") pod \"certified-operators-nm4nn\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.594233 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-utilities\") pod \"certified-operators-nm4nn\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.594275 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-catalog-content\") pod \"certified-operators-nm4nn\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.594755 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-utilities\") pod \"certified-operators-nm4nn\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.594836 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-catalog-content\") pod \"certified-operators-nm4nn\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.616268 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-626z8\" (UniqueName: \"kubernetes.io/projected/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-kube-api-access-626z8\") pod \"certified-operators-nm4nn\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:15 crc kubenswrapper[4717]: I0217 15:41:15.723617 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:16 crc kubenswrapper[4717]: I0217 15:41:16.240073 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nm4nn"] Feb 17 15:41:16 crc kubenswrapper[4717]: I0217 15:41:16.506918 4717 generic.go:334] "Generic (PLEG): container finished" podID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" containerID="8919f3ed582381485afff21f292ed5b01b0e6683cd74612301839d9adf30bbee" exitCode=0 Feb 17 15:41:16 crc kubenswrapper[4717]: I0217 15:41:16.507120 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm4nn" event={"ID":"e93a2340-3d25-40e9-bbf2-95fb6dc812ee","Type":"ContainerDied","Data":"8919f3ed582381485afff21f292ed5b01b0e6683cd74612301839d9adf30bbee"} Feb 17 15:41:16 crc kubenswrapper[4717]: I0217 15:41:16.507368 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm4nn" event={"ID":"e93a2340-3d25-40e9-bbf2-95fb6dc812ee","Type":"ContainerStarted","Data":"ddb7c92d28a903d581b21c4b0c7ce27183d5b7d756e1ef4e8510c8534d30d9b0"} Feb 17 15:41:16 crc kubenswrapper[4717]: I0217 15:41:16.509485 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:41:17 crc kubenswrapper[4717]: I0217 15:41:17.516415 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm4nn" event={"ID":"e93a2340-3d25-40e9-bbf2-95fb6dc812ee","Type":"ContainerStarted","Data":"9c7af36108072c22af36356719aa0a46d22042475ebd8abdc3a34c92d5f1df87"} Feb 17 15:41:18 crc kubenswrapper[4717]: I0217 15:41:18.558825 4717 generic.go:334] "Generic (PLEG): container finished" podID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" containerID="9c7af36108072c22af36356719aa0a46d22042475ebd8abdc3a34c92d5f1df87" exitCode=0 Feb 17 15:41:18 crc kubenswrapper[4717]: I0217 15:41:18.558918 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-nm4nn" event={"ID":"e93a2340-3d25-40e9-bbf2-95fb6dc812ee","Type":"ContainerDied","Data":"9c7af36108072c22af36356719aa0a46d22042475ebd8abdc3a34c92d5f1df87"} Feb 17 15:41:19 crc kubenswrapper[4717]: I0217 15:41:19.574030 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm4nn" event={"ID":"e93a2340-3d25-40e9-bbf2-95fb6dc812ee","Type":"ContainerStarted","Data":"fb0f59675ee8178dd85457a0e1870d6da66e88ac887bb04724ca563939037a85"} Feb 17 15:41:19 crc kubenswrapper[4717]: I0217 15:41:19.602195 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nm4nn" podStartSLOduration=1.9076505400000001 podStartE2EDuration="4.602172875s" podCreationTimestamp="2026-02-17 15:41:15 +0000 UTC" firstStartedPulling="2026-02-17 15:41:16.509287684 +0000 UTC m=+2942.925128160" lastFinishedPulling="2026-02-17 15:41:19.203810019 +0000 UTC m=+2945.619650495" observedRunningTime="2026-02-17 15:41:19.601524126 +0000 UTC m=+2946.017364652" watchObservedRunningTime="2026-02-17 15:41:19.602172875 +0000 UTC m=+2946.018013361" Feb 17 15:41:25 crc kubenswrapper[4717]: I0217 15:41:25.724696 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:25 crc kubenswrapper[4717]: I0217 15:41:25.725500 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:25 crc kubenswrapper[4717]: I0217 15:41:25.786810 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:26 crc kubenswrapper[4717]: I0217 15:41:26.731607 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:26 crc kubenswrapper[4717]: I0217 
15:41:26.797791 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nm4nn"] Feb 17 15:41:28 crc kubenswrapper[4717]: I0217 15:41:28.675399 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nm4nn" podUID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" containerName="registry-server" containerID="cri-o://fb0f59675ee8178dd85457a0e1870d6da66e88ac887bb04724ca563939037a85" gracePeriod=2 Feb 17 15:41:29 crc kubenswrapper[4717]: I0217 15:41:29.688781 4717 generic.go:334] "Generic (PLEG): container finished" podID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" containerID="fb0f59675ee8178dd85457a0e1870d6da66e88ac887bb04724ca563939037a85" exitCode=0 Feb 17 15:41:29 crc kubenswrapper[4717]: I0217 15:41:29.689241 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm4nn" event={"ID":"e93a2340-3d25-40e9-bbf2-95fb6dc812ee","Type":"ContainerDied","Data":"fb0f59675ee8178dd85457a0e1870d6da66e88ac887bb04724ca563939037a85"} Feb 17 15:41:29 crc kubenswrapper[4717]: I0217 15:41:29.839997 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:29 crc kubenswrapper[4717]: I0217 15:41:29.986744 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-626z8\" (UniqueName: \"kubernetes.io/projected/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-kube-api-access-626z8\") pod \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " Feb 17 15:41:29 crc kubenswrapper[4717]: I0217 15:41:29.986919 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-catalog-content\") pod \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " Feb 17 15:41:29 crc kubenswrapper[4717]: I0217 15:41:29.987017 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-utilities\") pod \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\" (UID: \"e93a2340-3d25-40e9-bbf2-95fb6dc812ee\") " Feb 17 15:41:29 crc kubenswrapper[4717]: I0217 15:41:29.987973 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-utilities" (OuterVolumeSpecName: "utilities") pod "e93a2340-3d25-40e9-bbf2-95fb6dc812ee" (UID: "e93a2340-3d25-40e9-bbf2-95fb6dc812ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:41:29 crc kubenswrapper[4717]: I0217 15:41:29.995306 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-kube-api-access-626z8" (OuterVolumeSpecName: "kube-api-access-626z8") pod "e93a2340-3d25-40e9-bbf2-95fb6dc812ee" (UID: "e93a2340-3d25-40e9-bbf2-95fb6dc812ee"). InnerVolumeSpecName "kube-api-access-626z8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:41:30 crc kubenswrapper[4717]: I0217 15:41:30.051269 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e93a2340-3d25-40e9-bbf2-95fb6dc812ee" (UID: "e93a2340-3d25-40e9-bbf2-95fb6dc812ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:41:30 crc kubenswrapper[4717]: I0217 15:41:30.089442 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:41:30 crc kubenswrapper[4717]: I0217 15:41:30.089470 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:41:30 crc kubenswrapper[4717]: I0217 15:41:30.089480 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-626z8\" (UniqueName: \"kubernetes.io/projected/e93a2340-3d25-40e9-bbf2-95fb6dc812ee-kube-api-access-626z8\") on node \"crc\" DevicePath \"\"" Feb 17 15:41:30 crc kubenswrapper[4717]: I0217 15:41:30.703957 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm4nn" event={"ID":"e93a2340-3d25-40e9-bbf2-95fb6dc812ee","Type":"ContainerDied","Data":"ddb7c92d28a903d581b21c4b0c7ce27183d5b7d756e1ef4e8510c8534d30d9b0"} Feb 17 15:41:30 crc kubenswrapper[4717]: I0217 15:41:30.704049 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nm4nn" Feb 17 15:41:30 crc kubenswrapper[4717]: I0217 15:41:30.704444 4717 scope.go:117] "RemoveContainer" containerID="fb0f59675ee8178dd85457a0e1870d6da66e88ac887bb04724ca563939037a85" Feb 17 15:41:30 crc kubenswrapper[4717]: I0217 15:41:30.736200 4717 scope.go:117] "RemoveContainer" containerID="9c7af36108072c22af36356719aa0a46d22042475ebd8abdc3a34c92d5f1df87" Feb 17 15:41:30 crc kubenswrapper[4717]: I0217 15:41:30.763448 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nm4nn"] Feb 17 15:41:30 crc kubenswrapper[4717]: I0217 15:41:30.776150 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nm4nn"] Feb 17 15:41:30 crc kubenswrapper[4717]: I0217 15:41:30.776271 4717 scope.go:117] "RemoveContainer" containerID="8919f3ed582381485afff21f292ed5b01b0e6683cd74612301839d9adf30bbee" Feb 17 15:41:31 crc kubenswrapper[4717]: I0217 15:41:31.870229 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" path="/var/lib/kubelet/pods/e93a2340-3d25-40e9-bbf2-95fb6dc812ee/volumes" Feb 17 15:43:20 crc kubenswrapper[4717]: I0217 15:43:20.809049 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:43:20 crc kubenswrapper[4717]: I0217 15:43:20.809801 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:43:50 crc kubenswrapper[4717]: 
I0217 15:43:50.808457 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:43:50 crc kubenswrapper[4717]: I0217 15:43:50.809095 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:44:20 crc kubenswrapper[4717]: I0217 15:44:20.808322 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:44:20 crc kubenswrapper[4717]: I0217 15:44:20.808931 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:44:20 crc kubenswrapper[4717]: I0217 15:44:20.808995 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:44:20 crc kubenswrapper[4717]: I0217 15:44:20.809929 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd"} 
pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:44:20 crc kubenswrapper[4717]: I0217 15:44:20.810018 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" gracePeriod=600 Feb 17 15:44:20 crc kubenswrapper[4717]: E0217 15:44:20.937692 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:44:21 crc kubenswrapper[4717]: I0217 15:44:21.663952 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" exitCode=0 Feb 17 15:44:21 crc kubenswrapper[4717]: I0217 15:44:21.664018 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd"} Feb 17 15:44:21 crc kubenswrapper[4717]: I0217 15:44:21.664098 4717 scope.go:117] "RemoveContainer" containerID="f641f01d802d39de025b194b77701d5801ece74386c0c17964f7b20e78780fd4" Feb 17 15:44:21 crc kubenswrapper[4717]: I0217 15:44:21.665112 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 
17 15:44:21 crc kubenswrapper[4717]: E0217 15:44:21.665689 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:44:33 crc kubenswrapper[4717]: I0217 15:44:33.846718 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:44:33 crc kubenswrapper[4717]: E0217 15:44:33.847524 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:44:46 crc kubenswrapper[4717]: I0217 15:44:46.847429 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:44:46 crc kubenswrapper[4717]: E0217 15:44:46.848688 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.169965 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97"] Feb 17 15:45:00 crc kubenswrapper[4717]: E0217 15:45:00.170964 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" containerName="extract-utilities" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.170981 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" containerName="extract-utilities" Feb 17 15:45:00 crc kubenswrapper[4717]: E0217 15:45:00.171011 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" containerName="registry-server" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.171019 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" containerName="registry-server" Feb 17 15:45:00 crc kubenswrapper[4717]: E0217 15:45:00.171043 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" containerName="extract-content" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.171054 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" containerName="extract-content" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.171321 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93a2340-3d25-40e9-bbf2-95fb6dc812ee" containerName="registry-server" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.172043 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.177823 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.179248 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.190057 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97"] Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.253471 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-config-volume\") pod \"collect-profiles-29522385-6mq97\" (UID: \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.253607 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-secret-volume\") pod \"collect-profiles-29522385-6mq97\" (UID: \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.253666 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvqts\" (UniqueName: \"kubernetes.io/projected/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-kube-api-access-rvqts\") pod \"collect-profiles-29522385-6mq97\" (UID: \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.355252 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvqts\" (UniqueName: \"kubernetes.io/projected/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-kube-api-access-rvqts\") pod \"collect-profiles-29522385-6mq97\" (UID: \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.355346 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-config-volume\") pod \"collect-profiles-29522385-6mq97\" (UID: \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.355407 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-secret-volume\") pod \"collect-profiles-29522385-6mq97\" (UID: \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.356512 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-config-volume\") pod \"collect-profiles-29522385-6mq97\" (UID: \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.367194 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-secret-volume\") pod \"collect-profiles-29522385-6mq97\" (UID: \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.376997 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvqts\" (UniqueName: \"kubernetes.io/projected/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-kube-api-access-rvqts\") pod \"collect-profiles-29522385-6mq97\" (UID: \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.503647 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:00 crc kubenswrapper[4717]: I0217 15:45:00.847255 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:45:00 crc kubenswrapper[4717]: E0217 15:45:00.847890 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:45:01 crc kubenswrapper[4717]: I0217 15:45:01.024581 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97"] Feb 17 15:45:01 crc kubenswrapper[4717]: I0217 15:45:01.067179 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" 
event={"ID":"74a5cc88-33cf-41c7-80e4-84db62b7eeb6","Type":"ContainerStarted","Data":"bf60b1924e746a34670e757f817ac7be29c91fea9b959ce8ca2498d6331e6022"} Feb 17 15:45:02 crc kubenswrapper[4717]: I0217 15:45:02.080760 4717 generic.go:334] "Generic (PLEG): container finished" podID="74a5cc88-33cf-41c7-80e4-84db62b7eeb6" containerID="73742159df0cd6bb638df2fa22dbfd38bb89535c689242f1116814607de4b887" exitCode=0 Feb 17 15:45:02 crc kubenswrapper[4717]: I0217 15:45:02.080860 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" event={"ID":"74a5cc88-33cf-41c7-80e4-84db62b7eeb6","Type":"ContainerDied","Data":"73742159df0cd6bb638df2fa22dbfd38bb89535c689242f1116814607de4b887"} Feb 17 15:45:03 crc kubenswrapper[4717]: I0217 15:45:03.561906 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:03 crc kubenswrapper[4717]: I0217 15:45:03.635533 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-config-volume\") pod \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\" (UID: \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " Feb 17 15:45:03 crc kubenswrapper[4717]: I0217 15:45:03.635607 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-secret-volume\") pod \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\" (UID: \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " Feb 17 15:45:03 crc kubenswrapper[4717]: I0217 15:45:03.635765 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvqts\" (UniqueName: \"kubernetes.io/projected/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-kube-api-access-rvqts\") pod \"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\" (UID: 
\"74a5cc88-33cf-41c7-80e4-84db62b7eeb6\") " Feb 17 15:45:03 crc kubenswrapper[4717]: I0217 15:45:03.637121 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-config-volume" (OuterVolumeSpecName: "config-volume") pod "74a5cc88-33cf-41c7-80e4-84db62b7eeb6" (UID: "74a5cc88-33cf-41c7-80e4-84db62b7eeb6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:45:03 crc kubenswrapper[4717]: I0217 15:45:03.643988 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-kube-api-access-rvqts" (OuterVolumeSpecName: "kube-api-access-rvqts") pod "74a5cc88-33cf-41c7-80e4-84db62b7eeb6" (UID: "74a5cc88-33cf-41c7-80e4-84db62b7eeb6"). InnerVolumeSpecName "kube-api-access-rvqts". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:45:03 crc kubenswrapper[4717]: I0217 15:45:03.650285 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "74a5cc88-33cf-41c7-80e4-84db62b7eeb6" (UID: "74a5cc88-33cf-41c7-80e4-84db62b7eeb6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:45:03 crc kubenswrapper[4717]: I0217 15:45:03.739426 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:45:03 crc kubenswrapper[4717]: I0217 15:45:03.739497 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:45:03 crc kubenswrapper[4717]: I0217 15:45:03.739517 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvqts\" (UniqueName: \"kubernetes.io/projected/74a5cc88-33cf-41c7-80e4-84db62b7eeb6-kube-api-access-rvqts\") on node \"crc\" DevicePath \"\"" Feb 17 15:45:04 crc kubenswrapper[4717]: I0217 15:45:04.111011 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" event={"ID":"74a5cc88-33cf-41c7-80e4-84db62b7eeb6","Type":"ContainerDied","Data":"bf60b1924e746a34670e757f817ac7be29c91fea9b959ce8ca2498d6331e6022"} Feb 17 15:45:04 crc kubenswrapper[4717]: I0217 15:45:04.111132 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf60b1924e746a34670e757f817ac7be29c91fea9b959ce8ca2498d6331e6022" Feb 17 15:45:04 crc kubenswrapper[4717]: I0217 15:45:04.111129 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-6mq97" Feb 17 15:45:04 crc kubenswrapper[4717]: I0217 15:45:04.701634 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp"] Feb 17 15:45:04 crc kubenswrapper[4717]: I0217 15:45:04.715438 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522340-vsdzp"] Feb 17 15:45:05 crc kubenswrapper[4717]: I0217 15:45:05.864321 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d67da46-54ca-4904-93f0-e2d985a9cedc" path="/var/lib/kubelet/pods/0d67da46-54ca-4904-93f0-e2d985a9cedc/volumes" Feb 17 15:45:12 crc kubenswrapper[4717]: I0217 15:45:12.549419 4717 scope.go:117] "RemoveContainer" containerID="1182285e21d67d70e1e882b580603d358ec356c4b4e338b5a91afd30ececa508" Feb 17 15:45:12 crc kubenswrapper[4717]: I0217 15:45:12.847154 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:45:12 crc kubenswrapper[4717]: E0217 15:45:12.848043 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:45:24 crc kubenswrapper[4717]: I0217 15:45:24.847909 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:45:24 crc kubenswrapper[4717]: E0217 15:45:24.849420 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:45:36 crc kubenswrapper[4717]: I0217 15:45:36.849261 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:45:36 crc kubenswrapper[4717]: E0217 15:45:36.855319 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:45:49 crc kubenswrapper[4717]: I0217 15:45:49.849021 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:45:49 crc kubenswrapper[4717]: E0217 15:45:49.850503 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:46:04 crc kubenswrapper[4717]: I0217 15:46:04.848981 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:46:04 crc kubenswrapper[4717]: E0217 15:46:04.850173 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:46:15 crc kubenswrapper[4717]: I0217 15:46:15.848177 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:46:15 crc kubenswrapper[4717]: E0217 15:46:15.849238 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:46:29 crc kubenswrapper[4717]: I0217 15:46:29.847908 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:46:29 crc kubenswrapper[4717]: E0217 15:46:29.849164 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:46:40 crc kubenswrapper[4717]: I0217 15:46:40.846865 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:46:40 crc kubenswrapper[4717]: E0217 15:46:40.847599 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:46:52 crc kubenswrapper[4717]: I0217 15:46:52.848116 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:46:52 crc kubenswrapper[4717]: E0217 15:46:52.849637 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:47:07 crc kubenswrapper[4717]: I0217 15:47:07.847659 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:47:07 crc kubenswrapper[4717]: E0217 15:47:07.848495 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:47:19 crc kubenswrapper[4717]: I0217 15:47:19.847045 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:47:19 crc kubenswrapper[4717]: E0217 15:47:19.848431 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:47:34 crc kubenswrapper[4717]: I0217 15:47:34.847340 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:47:34 crc kubenswrapper[4717]: E0217 15:47:34.848104 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:47:48 crc kubenswrapper[4717]: I0217 15:47:48.847555 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:47:48 crc kubenswrapper[4717]: E0217 15:47:48.848578 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:48:03 crc kubenswrapper[4717]: I0217 15:48:03.847301 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:48:03 crc kubenswrapper[4717]: E0217 15:48:03.848373 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:48:14 crc kubenswrapper[4717]: I0217 15:48:14.846709 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:48:14 crc kubenswrapper[4717]: E0217 15:48:14.847415 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.688083 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cxbg6"] Feb 17 15:48:15 crc kubenswrapper[4717]: E0217 15:48:15.688506 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a5cc88-33cf-41c7-80e4-84db62b7eeb6" containerName="collect-profiles" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.688523 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a5cc88-33cf-41c7-80e4-84db62b7eeb6" containerName="collect-profiles" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.688762 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a5cc88-33cf-41c7-80e4-84db62b7eeb6" containerName="collect-profiles" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.689970 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.703133 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxbg6"] Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.815221 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlf2t\" (UniqueName: \"kubernetes.io/projected/454d2d52-d8e3-4069-8b17-36622c4f7561-kube-api-access-xlf2t\") pod \"redhat-operators-cxbg6\" (UID: \"454d2d52-d8e3-4069-8b17-36622c4f7561\") " pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.815759 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454d2d52-d8e3-4069-8b17-36622c4f7561-utilities\") pod \"redhat-operators-cxbg6\" (UID: \"454d2d52-d8e3-4069-8b17-36622c4f7561\") " pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.816047 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454d2d52-d8e3-4069-8b17-36622c4f7561-catalog-content\") pod \"redhat-operators-cxbg6\" (UID: \"454d2d52-d8e3-4069-8b17-36622c4f7561\") " pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.918185 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlf2t\" (UniqueName: \"kubernetes.io/projected/454d2d52-d8e3-4069-8b17-36622c4f7561-kube-api-access-xlf2t\") pod \"redhat-operators-cxbg6\" (UID: \"454d2d52-d8e3-4069-8b17-36622c4f7561\") " pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.918409 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454d2d52-d8e3-4069-8b17-36622c4f7561-utilities\") pod \"redhat-operators-cxbg6\" (UID: \"454d2d52-d8e3-4069-8b17-36622c4f7561\") " pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.918502 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454d2d52-d8e3-4069-8b17-36622c4f7561-catalog-content\") pod \"redhat-operators-cxbg6\" (UID: \"454d2d52-d8e3-4069-8b17-36622c4f7561\") " pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.919145 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454d2d52-d8e3-4069-8b17-36622c4f7561-catalog-content\") pod \"redhat-operators-cxbg6\" (UID: \"454d2d52-d8e3-4069-8b17-36622c4f7561\") " pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.920535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454d2d52-d8e3-4069-8b17-36622c4f7561-utilities\") pod \"redhat-operators-cxbg6\" (UID: \"454d2d52-d8e3-4069-8b17-36622c4f7561\") " pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:15 crc kubenswrapper[4717]: I0217 15:48:15.948107 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlf2t\" (UniqueName: \"kubernetes.io/projected/454d2d52-d8e3-4069-8b17-36622c4f7561-kube-api-access-xlf2t\") pod \"redhat-operators-cxbg6\" (UID: \"454d2d52-d8e3-4069-8b17-36622c4f7561\") " pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:16 crc kubenswrapper[4717]: I0217 15:48:16.007231 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:16 crc kubenswrapper[4717]: I0217 15:48:16.524041 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxbg6"] Feb 17 15:48:17 crc kubenswrapper[4717]: I0217 15:48:17.171487 4717 generic.go:334] "Generic (PLEG): container finished" podID="454d2d52-d8e3-4069-8b17-36622c4f7561" containerID="c8146637f2dff1424fd7a8eecf3506e61fe28eb667c9717abf03a7026fda9435" exitCode=0 Feb 17 15:48:17 crc kubenswrapper[4717]: I0217 15:48:17.171593 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxbg6" event={"ID":"454d2d52-d8e3-4069-8b17-36622c4f7561","Type":"ContainerDied","Data":"c8146637f2dff1424fd7a8eecf3506e61fe28eb667c9717abf03a7026fda9435"} Feb 17 15:48:17 crc kubenswrapper[4717]: I0217 15:48:17.171816 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxbg6" event={"ID":"454d2d52-d8e3-4069-8b17-36622c4f7561","Type":"ContainerStarted","Data":"d787bbb3447e0d3560a3701156ba5ba309e341da45b07c813bb128b856f3179b"} Feb 17 15:48:17 crc kubenswrapper[4717]: I0217 15:48:17.173240 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:48:24 crc kubenswrapper[4717]: I0217 15:48:24.233310 4717 generic.go:334] "Generic (PLEG): container finished" podID="6bc8b07d-6032-43fa-821d-fa1685427d56" containerID="c93e579fc219ac3b634cde83769127c7881ea521e0818922f7bea1f386da9148" exitCode=0 Feb 17 15:48:24 crc kubenswrapper[4717]: I0217 15:48:24.233370 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6bc8b07d-6032-43fa-821d-fa1685427d56","Type":"ContainerDied","Data":"c93e579fc219ac3b634cde83769127c7881ea521e0818922f7bea1f386da9148"} Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.245399 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.292784 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ca-certs\") pod \"6bc8b07d-6032-43fa-821d-fa1685427d56\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.292870 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ssh-key\") pod \"6bc8b07d-6032-43fa-821d-fa1685427d56\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.292936 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"6bc8b07d-6032-43fa-821d-fa1685427d56\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.292983 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config\") pod \"6bc8b07d-6032-43fa-821d-fa1685427d56\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.293030 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-workdir\") pod \"6bc8b07d-6032-43fa-821d-fa1685427d56\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.293072 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-config-data\") pod \"6bc8b07d-6032-43fa-821d-fa1685427d56\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.293259 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config-secret\") pod \"6bc8b07d-6032-43fa-821d-fa1685427d56\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.293356 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz2p7\" (UniqueName: \"kubernetes.io/projected/6bc8b07d-6032-43fa-821d-fa1685427d56-kube-api-access-jz2p7\") pod \"6bc8b07d-6032-43fa-821d-fa1685427d56\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.293408 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-temporary\") pod \"6bc8b07d-6032-43fa-821d-fa1685427d56\" (UID: \"6bc8b07d-6032-43fa-821d-fa1685427d56\") " Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.294806 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "6bc8b07d-6032-43fa-821d-fa1685427d56" (UID: "6bc8b07d-6032-43fa-821d-fa1685427d56"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.297413 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-config-data" (OuterVolumeSpecName: "config-data") pod "6bc8b07d-6032-43fa-821d-fa1685427d56" (UID: "6bc8b07d-6032-43fa-821d-fa1685427d56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.308921 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "6bc8b07d-6032-43fa-821d-fa1685427d56" (UID: "6bc8b07d-6032-43fa-821d-fa1685427d56"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.309752 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc8b07d-6032-43fa-821d-fa1685427d56-kube-api-access-jz2p7" (OuterVolumeSpecName: "kube-api-access-jz2p7") pod "6bc8b07d-6032-43fa-821d-fa1685427d56" (UID: "6bc8b07d-6032-43fa-821d-fa1685427d56"). InnerVolumeSpecName "kube-api-access-jz2p7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.312822 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6bc8b07d-6032-43fa-821d-fa1685427d56","Type":"ContainerDied","Data":"2aa15342fc90321cf3f8ad127cceeddfd36580b5451edef92f44c1355db5efa9"} Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.312891 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aa15342fc90321cf3f8ad127cceeddfd36580b5451edef92f44c1355db5efa9" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.312991 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.324994 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "6bc8b07d-6032-43fa-821d-fa1685427d56" (UID: "6bc8b07d-6032-43fa-821d-fa1685427d56"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.327969 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "6bc8b07d-6032-43fa-821d-fa1685427d56" (UID: "6bc8b07d-6032-43fa-821d-fa1685427d56"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.345210 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6bc8b07d-6032-43fa-821d-fa1685427d56" (UID: "6bc8b07d-6032-43fa-821d-fa1685427d56"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.345250 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6bc8b07d-6032-43fa-821d-fa1685427d56" (UID: "6bc8b07d-6032-43fa-821d-fa1685427d56"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.373939 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6bc8b07d-6032-43fa-821d-fa1685427d56" (UID: "6bc8b07d-6032-43fa-821d-fa1685427d56"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.395861 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.395891 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.395903 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz2p7\" (UniqueName: \"kubernetes.io/projected/6bc8b07d-6032-43fa-821d-fa1685427d56-kube-api-access-jz2p7\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.395912 4717 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.395923 4717 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.395931 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6bc8b07d-6032-43fa-821d-fa1685427d56-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.395962 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 
15:48:29.395971 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bc8b07d-6032-43fa-821d-fa1685427d56-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.395980 4717 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6bc8b07d-6032-43fa-821d-fa1685427d56-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.415170 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.497816 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:29 crc kubenswrapper[4717]: I0217 15:48:29.846967 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:48:29 crc kubenswrapper[4717]: E0217 15:48:29.847363 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:48:30 crc kubenswrapper[4717]: I0217 15:48:30.328113 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxbg6" event={"ID":"454d2d52-d8e3-4069-8b17-36622c4f7561","Type":"ContainerStarted","Data":"6f3f51d8df9a5f5e69f3e159ebbfdc7a9ad7c367b854131cfa66da4da0072e54"} Feb 
17 15:48:32 crc kubenswrapper[4717]: I0217 15:48:32.350631 4717 generic.go:334] "Generic (PLEG): container finished" podID="454d2d52-d8e3-4069-8b17-36622c4f7561" containerID="6f3f51d8df9a5f5e69f3e159ebbfdc7a9ad7c367b854131cfa66da4da0072e54" exitCode=0 Feb 17 15:48:32 crc kubenswrapper[4717]: I0217 15:48:32.350706 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxbg6" event={"ID":"454d2d52-d8e3-4069-8b17-36622c4f7561","Type":"ContainerDied","Data":"6f3f51d8df9a5f5e69f3e159ebbfdc7a9ad7c367b854131cfa66da4da0072e54"} Feb 17 15:48:33 crc kubenswrapper[4717]: I0217 15:48:33.367599 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxbg6" event={"ID":"454d2d52-d8e3-4069-8b17-36622c4f7561","Type":"ContainerStarted","Data":"7ee81af99ca4ae1f71548bf3ecbff4969c9eaf1ce43ee5a1f609c2919f38c278"} Feb 17 15:48:33 crc kubenswrapper[4717]: I0217 15:48:33.401021 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cxbg6" podStartSLOduration=2.620488864 podStartE2EDuration="18.40098913s" podCreationTimestamp="2026-02-17 15:48:15 +0000 UTC" firstStartedPulling="2026-02-17 15:48:17.17298918 +0000 UTC m=+3363.588829656" lastFinishedPulling="2026-02-17 15:48:32.953489446 +0000 UTC m=+3379.369329922" observedRunningTime="2026-02-17 15:48:33.389925118 +0000 UTC m=+3379.805765624" watchObservedRunningTime="2026-02-17 15:48:33.40098913 +0000 UTC m=+3379.816829636" Feb 17 15:48:36 crc kubenswrapper[4717]: I0217 15:48:36.008329 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:36 crc kubenswrapper[4717]: I0217 15:48:36.008668 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.080041 4717 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-cxbg6" podUID="454d2d52-d8e3-4069-8b17-36622c4f7561" containerName="registry-server" probeResult="failure" output=< Feb 17 15:48:37 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 17 15:48:37 crc kubenswrapper[4717]: > Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.602150 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 15:48:37 crc kubenswrapper[4717]: E0217 15:48:37.603192 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc8b07d-6032-43fa-821d-fa1685427d56" containerName="tempest-tests-tempest-tests-runner" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.603208 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc8b07d-6032-43fa-821d-fa1685427d56" containerName="tempest-tests-tempest-tests-runner" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.603467 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc8b07d-6032-43fa-821d-fa1685427d56" containerName="tempest-tests-tempest-tests-runner" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.604167 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.607542 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-2lnjw" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.619365 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.684314 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cm4t\" (UniqueName: \"kubernetes.io/projected/2f75b776-5a46-4241-adb5-eb50dbc8ba0f-kube-api-access-7cm4t\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2f75b776-5a46-4241-adb5-eb50dbc8ba0f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.684567 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2f75b776-5a46-4241-adb5-eb50dbc8ba0f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.786627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cm4t\" (UniqueName: \"kubernetes.io/projected/2f75b776-5a46-4241-adb5-eb50dbc8ba0f-kube-api-access-7cm4t\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2f75b776-5a46-4241-adb5-eb50dbc8ba0f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.787158 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2f75b776-5a46-4241-adb5-eb50dbc8ba0f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.787596 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2f75b776-5a46-4241-adb5-eb50dbc8ba0f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.810374 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cm4t\" (UniqueName: \"kubernetes.io/projected/2f75b776-5a46-4241-adb5-eb50dbc8ba0f-kube-api-access-7cm4t\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2f75b776-5a46-4241-adb5-eb50dbc8ba0f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.821349 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2f75b776-5a46-4241-adb5-eb50dbc8ba0f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 15:48:37 crc kubenswrapper[4717]: I0217 15:48:37.935913 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 15:48:38 crc kubenswrapper[4717]: I0217 15:48:38.451966 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 15:48:39 crc kubenswrapper[4717]: I0217 15:48:39.436154 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2f75b776-5a46-4241-adb5-eb50dbc8ba0f","Type":"ContainerStarted","Data":"43bcd07bae2be7e9159fd8569c04784d994860be717de3aa0ff8468658b636fb"} Feb 17 15:48:40 crc kubenswrapper[4717]: I0217 15:48:40.456568 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2f75b776-5a46-4241-adb5-eb50dbc8ba0f","Type":"ContainerStarted","Data":"22fcae9e3eba0db13b391e7e4922e840bdd4f536504ab88107fd4f3ba2b69f8b"} Feb 17 15:48:40 crc kubenswrapper[4717]: I0217 15:48:40.480456 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.5648238819999998 podStartE2EDuration="3.480429403s" podCreationTimestamp="2026-02-17 15:48:37 +0000 UTC" firstStartedPulling="2026-02-17 15:48:38.445626908 +0000 UTC m=+3384.861467414" lastFinishedPulling="2026-02-17 15:48:39.361232459 +0000 UTC m=+3385.777072935" observedRunningTime="2026-02-17 15:48:40.473390394 +0000 UTC m=+3386.889230880" watchObservedRunningTime="2026-02-17 15:48:40.480429403 +0000 UTC m=+3386.896269919" Feb 17 15:48:42 crc kubenswrapper[4717]: I0217 15:48:42.846931 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:48:42 crc kubenswrapper[4717]: E0217 15:48:42.847610 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:48:46 crc kubenswrapper[4717]: I0217 15:48:46.071302 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:46 crc kubenswrapper[4717]: I0217 15:48:46.137308 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cxbg6" Feb 17 15:48:46 crc kubenswrapper[4717]: I0217 15:48:46.743496 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxbg6"] Feb 17 15:48:46 crc kubenswrapper[4717]: I0217 15:48:46.898635 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jb79c"] Feb 17 15:48:46 crc kubenswrapper[4717]: I0217 15:48:46.899303 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jb79c" podUID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerName="registry-server" containerID="cri-o://eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec" gracePeriod=2 Feb 17 15:48:47 crc kubenswrapper[4717]: E0217 15:48:47.737548 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec is running failed: container process not found" containerID="eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 15:48:47 crc kubenswrapper[4717]: E0217 15:48:47.737978 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec is running failed: container process not found" containerID="eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 15:48:47 crc kubenswrapper[4717]: E0217 15:48:47.738849 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec is running failed: container process not found" containerID="eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 15:48:47 crc kubenswrapper[4717]: E0217 15:48:47.738888 4717 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-jb79c" podUID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerName="registry-server" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.146479 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.235308 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-utilities\") pod \"f2ed0865-16aa-42df-94e8-c228b0baf920\" (UID: \"f2ed0865-16aa-42df-94e8-c228b0baf920\") " Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.235434 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c9bk\" (UniqueName: \"kubernetes.io/projected/f2ed0865-16aa-42df-94e8-c228b0baf920-kube-api-access-6c9bk\") pod \"f2ed0865-16aa-42df-94e8-c228b0baf920\" (UID: \"f2ed0865-16aa-42df-94e8-c228b0baf920\") " Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.235523 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-catalog-content\") pod \"f2ed0865-16aa-42df-94e8-c228b0baf920\" (UID: \"f2ed0865-16aa-42df-94e8-c228b0baf920\") " Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.236392 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-utilities" (OuterVolumeSpecName: "utilities") pod "f2ed0865-16aa-42df-94e8-c228b0baf920" (UID: "f2ed0865-16aa-42df-94e8-c228b0baf920"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.241907 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ed0865-16aa-42df-94e8-c228b0baf920-kube-api-access-6c9bk" (OuterVolumeSpecName: "kube-api-access-6c9bk") pod "f2ed0865-16aa-42df-94e8-c228b0baf920" (UID: "f2ed0865-16aa-42df-94e8-c228b0baf920"). InnerVolumeSpecName "kube-api-access-6c9bk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.339302 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.339335 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c9bk\" (UniqueName: \"kubernetes.io/projected/f2ed0865-16aa-42df-94e8-c228b0baf920-kube-api-access-6c9bk\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.367269 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2ed0865-16aa-42df-94e8-c228b0baf920" (UID: "f2ed0865-16aa-42df-94e8-c228b0baf920"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.441038 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ed0865-16aa-42df-94e8-c228b0baf920-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.557165 4717 generic.go:334] "Generic (PLEG): container finished" podID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerID="eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec" exitCode=0 Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.557215 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb79c" event={"ID":"f2ed0865-16aa-42df-94e8-c228b0baf920","Type":"ContainerDied","Data":"eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec"} Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.557275 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-jb79c" event={"ID":"f2ed0865-16aa-42df-94e8-c228b0baf920","Type":"ContainerDied","Data":"566d8ff9ae0765fe05847e36a3298e428f46b57fbe0b4094e3c64033710a7333"} Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.557298 4717 scope.go:117] "RemoveContainer" containerID="eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.557679 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jb79c" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.576231 4717 scope.go:117] "RemoveContainer" containerID="b957085354d7503c278d5292e787777409672d3bdafa1a0351e82f0278d16162" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.595212 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jb79c"] Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.602111 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jb79c"] Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.606259 4717 scope.go:117] "RemoveContainer" containerID="1a25c373aec7d594536cd1fb01e6aa1575bbdc3e9554f50fbf21f0cc49bf6f35" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.648375 4717 scope.go:117] "RemoveContainer" containerID="eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec" Feb 17 15:48:48 crc kubenswrapper[4717]: E0217 15:48:48.648866 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec\": container with ID starting with eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec not found: ID does not exist" containerID="eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.648911 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec"} err="failed to get container status \"eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec\": rpc error: code = NotFound desc = could not find container \"eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec\": container with ID starting with eab81b6bbfb2ed6fd16116ec7736efe9744053af52ccf815b9ea16339275ebec not found: ID does not exist" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.648941 4717 scope.go:117] "RemoveContainer" containerID="b957085354d7503c278d5292e787777409672d3bdafa1a0351e82f0278d16162" Feb 17 15:48:48 crc kubenswrapper[4717]: E0217 15:48:48.649383 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b957085354d7503c278d5292e787777409672d3bdafa1a0351e82f0278d16162\": container with ID starting with b957085354d7503c278d5292e787777409672d3bdafa1a0351e82f0278d16162 not found: ID does not exist" containerID="b957085354d7503c278d5292e787777409672d3bdafa1a0351e82f0278d16162" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.649410 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b957085354d7503c278d5292e787777409672d3bdafa1a0351e82f0278d16162"} err="failed to get container status \"b957085354d7503c278d5292e787777409672d3bdafa1a0351e82f0278d16162\": rpc error: code = NotFound desc = could not find container \"b957085354d7503c278d5292e787777409672d3bdafa1a0351e82f0278d16162\": container with ID starting with b957085354d7503c278d5292e787777409672d3bdafa1a0351e82f0278d16162 not found: ID does not exist" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.649427 4717 scope.go:117] "RemoveContainer" containerID="1a25c373aec7d594536cd1fb01e6aa1575bbdc3e9554f50fbf21f0cc49bf6f35" Feb 17 15:48:48 crc kubenswrapper[4717]: E0217 
15:48:48.649639 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a25c373aec7d594536cd1fb01e6aa1575bbdc3e9554f50fbf21f0cc49bf6f35\": container with ID starting with 1a25c373aec7d594536cd1fb01e6aa1575bbdc3e9554f50fbf21f0cc49bf6f35 not found: ID does not exist" containerID="1a25c373aec7d594536cd1fb01e6aa1575bbdc3e9554f50fbf21f0cc49bf6f35" Feb 17 15:48:48 crc kubenswrapper[4717]: I0217 15:48:48.649659 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a25c373aec7d594536cd1fb01e6aa1575bbdc3e9554f50fbf21f0cc49bf6f35"} err="failed to get container status \"1a25c373aec7d594536cd1fb01e6aa1575bbdc3e9554f50fbf21f0cc49bf6f35\": rpc error: code = NotFound desc = could not find container \"1a25c373aec7d594536cd1fb01e6aa1575bbdc3e9554f50fbf21f0cc49bf6f35\": container with ID starting with 1a25c373aec7d594536cd1fb01e6aa1575bbdc3e9554f50fbf21f0cc49bf6f35 not found: ID does not exist" Feb 17 15:48:49 crc kubenswrapper[4717]: I0217 15:48:49.858722 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2ed0865-16aa-42df-94e8-c228b0baf920" path="/var/lib/kubelet/pods/f2ed0865-16aa-42df-94e8-c228b0baf920/volumes" Feb 17 15:48:57 crc kubenswrapper[4717]: I0217 15:48:57.847810 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:48:57 crc kubenswrapper[4717]: E0217 15:48:57.848818 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.379568 
4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dwwm7/must-gather-ftttb"] Feb 17 15:49:02 crc kubenswrapper[4717]: E0217 15:49:02.381093 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerName="extract-content" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.381332 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerName="extract-content" Feb 17 15:49:02 crc kubenswrapper[4717]: E0217 15:49:02.381374 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerName="registry-server" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.381380 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerName="registry-server" Feb 17 15:49:02 crc kubenswrapper[4717]: E0217 15:49:02.381396 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerName="extract-utilities" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.381403 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerName="extract-utilities" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.381587 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ed0865-16aa-42df-94e8-c228b0baf920" containerName="registry-server" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.382538 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dwwm7/must-gather-ftttb" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.386681 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dwwm7"/"kube-root-ca.crt" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.386954 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dwwm7"/"openshift-service-ca.crt" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.389932 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dwwm7/must-gather-ftttb"] Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.526783 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlxns\" (UniqueName: \"kubernetes.io/projected/76d22ade-8acb-4a71-a950-880de86ec228-kube-api-access-rlxns\") pod \"must-gather-ftttb\" (UID: \"76d22ade-8acb-4a71-a950-880de86ec228\") " pod="openshift-must-gather-dwwm7/must-gather-ftttb" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.526835 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76d22ade-8acb-4a71-a950-880de86ec228-must-gather-output\") pod \"must-gather-ftttb\" (UID: \"76d22ade-8acb-4a71-a950-880de86ec228\") " pod="openshift-must-gather-dwwm7/must-gather-ftttb" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.629113 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlxns\" (UniqueName: \"kubernetes.io/projected/76d22ade-8acb-4a71-a950-880de86ec228-kube-api-access-rlxns\") pod \"must-gather-ftttb\" (UID: \"76d22ade-8acb-4a71-a950-880de86ec228\") " pod="openshift-must-gather-dwwm7/must-gather-ftttb" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.629159 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76d22ade-8acb-4a71-a950-880de86ec228-must-gather-output\") pod \"must-gather-ftttb\" (UID: \"76d22ade-8acb-4a71-a950-880de86ec228\") " pod="openshift-must-gather-dwwm7/must-gather-ftttb" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.629639 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76d22ade-8acb-4a71-a950-880de86ec228-must-gather-output\") pod \"must-gather-ftttb\" (UID: \"76d22ade-8acb-4a71-a950-880de86ec228\") " pod="openshift-must-gather-dwwm7/must-gather-ftttb" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.650435 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlxns\" (UniqueName: \"kubernetes.io/projected/76d22ade-8acb-4a71-a950-880de86ec228-kube-api-access-rlxns\") pod \"must-gather-ftttb\" (UID: \"76d22ade-8acb-4a71-a950-880de86ec228\") " pod="openshift-must-gather-dwwm7/must-gather-ftttb" Feb 17 15:49:02 crc kubenswrapper[4717]: I0217 15:49:02.710239 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dwwm7/must-gather-ftttb" Feb 17 15:49:03 crc kubenswrapper[4717]: I0217 15:49:03.188836 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dwwm7/must-gather-ftttb"] Feb 17 15:49:03 crc kubenswrapper[4717]: I0217 15:49:03.725702 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dwwm7/must-gather-ftttb" event={"ID":"76d22ade-8acb-4a71-a950-880de86ec228","Type":"ContainerStarted","Data":"d36f6d449c2f582e070d088c9bb421684266174f1c27c4fc6e4dee1431d77600"} Feb 17 15:49:08 crc kubenswrapper[4717]: I0217 15:49:08.847262 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:49:08 crc kubenswrapper[4717]: E0217 15:49:08.848202 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:49:11 crc kubenswrapper[4717]: I0217 15:49:11.802472 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dwwm7/must-gather-ftttb" event={"ID":"76d22ade-8acb-4a71-a950-880de86ec228","Type":"ContainerStarted","Data":"af11359abd1d087183410a62eef3e53589a6f7c8e6330eeeef37487156ec6c22"} Feb 17 15:49:11 crc kubenswrapper[4717]: I0217 15:49:11.803001 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dwwm7/must-gather-ftttb" event={"ID":"76d22ade-8acb-4a71-a950-880de86ec228","Type":"ContainerStarted","Data":"f39a570357654eb2ef5125f5c68fb0c322352e63b91ba71edb7b6d258ea0cbb0"} Feb 17 15:49:11 crc kubenswrapper[4717]: I0217 15:49:11.831975 4717 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-dwwm7/must-gather-ftttb" podStartSLOduration=2.190918873 podStartE2EDuration="9.831954654s" podCreationTimestamp="2026-02-17 15:49:02 +0000 UTC" firstStartedPulling="2026-02-17 15:49:03.195002413 +0000 UTC m=+3409.610842929" lastFinishedPulling="2026-02-17 15:49:10.836038224 +0000 UTC m=+3417.251878710" observedRunningTime="2026-02-17 15:49:11.819813231 +0000 UTC m=+3418.235653707" watchObservedRunningTime="2026-02-17 15:49:11.831954654 +0000 UTC m=+3418.247795140" Feb 17 15:49:14 crc kubenswrapper[4717]: I0217 15:49:14.773218 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dwwm7/crc-debug-gk6q2"] Feb 17 15:49:14 crc kubenswrapper[4717]: I0217 15:49:14.774875 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" Feb 17 15:49:14 crc kubenswrapper[4717]: I0217 15:49:14.781539 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dwwm7"/"default-dockercfg-gkfm5" Feb 17 15:49:14 crc kubenswrapper[4717]: I0217 15:49:14.859557 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nntcr\" (UniqueName: \"kubernetes.io/projected/740a901f-8b10-42b7-a747-8c405d4cc01f-kube-api-access-nntcr\") pod \"crc-debug-gk6q2\" (UID: \"740a901f-8b10-42b7-a747-8c405d4cc01f\") " pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" Feb 17 15:49:14 crc kubenswrapper[4717]: I0217 15:49:14.859846 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/740a901f-8b10-42b7-a747-8c405d4cc01f-host\") pod \"crc-debug-gk6q2\" (UID: \"740a901f-8b10-42b7-a747-8c405d4cc01f\") " pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" Feb 17 15:49:14 crc kubenswrapper[4717]: I0217 15:49:14.961561 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nntcr\" (UniqueName: \"kubernetes.io/projected/740a901f-8b10-42b7-a747-8c405d4cc01f-kube-api-access-nntcr\") pod \"crc-debug-gk6q2\" (UID: \"740a901f-8b10-42b7-a747-8c405d4cc01f\") " pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" Feb 17 15:49:14 crc kubenswrapper[4717]: I0217 15:49:14.961681 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/740a901f-8b10-42b7-a747-8c405d4cc01f-host\") pod \"crc-debug-gk6q2\" (UID: \"740a901f-8b10-42b7-a747-8c405d4cc01f\") " pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" Feb 17 15:49:14 crc kubenswrapper[4717]: I0217 15:49:14.961882 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/740a901f-8b10-42b7-a747-8c405d4cc01f-host\") pod \"crc-debug-gk6q2\" (UID: \"740a901f-8b10-42b7-a747-8c405d4cc01f\") " pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" Feb 17 15:49:14 crc kubenswrapper[4717]: I0217 15:49:14.981820 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nntcr\" (UniqueName: \"kubernetes.io/projected/740a901f-8b10-42b7-a747-8c405d4cc01f-kube-api-access-nntcr\") pod \"crc-debug-gk6q2\" (UID: \"740a901f-8b10-42b7-a747-8c405d4cc01f\") " pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" Feb 17 15:49:15 crc kubenswrapper[4717]: I0217 15:49:15.093496 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" Feb 17 15:49:15 crc kubenswrapper[4717]: W0217 15:49:15.125812 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod740a901f_8b10_42b7_a747_8c405d4cc01f.slice/crio-55544a8e15a39f11672560168d8a2c823e7cb34cee722a2074e24a0cb7674b54 WatchSource:0}: Error finding container 55544a8e15a39f11672560168d8a2c823e7cb34cee722a2074e24a0cb7674b54: Status 404 returned error can't find the container with id 55544a8e15a39f11672560168d8a2c823e7cb34cee722a2074e24a0cb7674b54 Feb 17 15:49:15 crc kubenswrapper[4717]: I0217 15:49:15.872531 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" event={"ID":"740a901f-8b10-42b7-a747-8c405d4cc01f","Type":"ContainerStarted","Data":"55544a8e15a39f11672560168d8a2c823e7cb34cee722a2074e24a0cb7674b54"} Feb 17 15:49:23 crc kubenswrapper[4717]: I0217 15:49:23.847115 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:49:25 crc kubenswrapper[4717]: I0217 15:49:25.947462 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"5358db44d6f61af277bafa36114e49aed7d2fde0271825a0b49956b39c5a8b5a"} Feb 17 15:49:25 crc kubenswrapper[4717]: I0217 15:49:25.949455 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" event={"ID":"740a901f-8b10-42b7-a747-8c405d4cc01f","Type":"ContainerStarted","Data":"928777e45c7ce3344fcf275b44467c9a5fe5e47f7ad319ea88637a94be76c3a8"} Feb 17 15:49:25 crc kubenswrapper[4717]: I0217 15:49:25.995245 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" podStartSLOduration=2.000228108 
podStartE2EDuration="11.995228394s" podCreationTimestamp="2026-02-17 15:49:14 +0000 UTC" firstStartedPulling="2026-02-17 15:49:15.128001816 +0000 UTC m=+3421.543842282" lastFinishedPulling="2026-02-17 15:49:25.123002072 +0000 UTC m=+3431.538842568" observedRunningTime="2026-02-17 15:49:25.994259947 +0000 UTC m=+3432.410100423" watchObservedRunningTime="2026-02-17 15:49:25.995228394 +0000 UTC m=+3432.411068870" Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.580690 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9gf9t"] Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.611617 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gf9t"] Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.611818 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.730034 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbcqc\" (UniqueName: \"kubernetes.io/projected/97724a8a-8be7-4d6f-99e1-d76dbfd82931-kube-api-access-dbcqc\") pod \"redhat-marketplace-9gf9t\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.730115 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-catalog-content\") pod \"redhat-marketplace-9gf9t\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.730221 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-utilities\") pod \"redhat-marketplace-9gf9t\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.831885 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbcqc\" (UniqueName: \"kubernetes.io/projected/97724a8a-8be7-4d6f-99e1-d76dbfd82931-kube-api-access-dbcqc\") pod \"redhat-marketplace-9gf9t\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.831944 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-catalog-content\") pod \"redhat-marketplace-9gf9t\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.832026 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-utilities\") pod \"redhat-marketplace-9gf9t\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.832753 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-utilities\") pod \"redhat-marketplace-9gf9t\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.832974 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-catalog-content\") pod \"redhat-marketplace-9gf9t\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.855809 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbcqc\" (UniqueName: \"kubernetes.io/projected/97724a8a-8be7-4d6f-99e1-d76dbfd82931-kube-api-access-dbcqc\") pod \"redhat-marketplace-9gf9t\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:28 crc kubenswrapper[4717]: I0217 15:49:28.941331 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:29 crc kubenswrapper[4717]: I0217 15:49:29.488970 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gf9t"] Feb 17 15:49:30 crc kubenswrapper[4717]: I0217 15:49:30.013380 4717 generic.go:334] "Generic (PLEG): container finished" podID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" containerID="567d668ca551ff6a6f7aa76135f7e6cb746ed37d8f6ef12a873022d9a789fbbf" exitCode=0 Feb 17 15:49:30 crc kubenswrapper[4717]: I0217 15:49:30.013494 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gf9t" event={"ID":"97724a8a-8be7-4d6f-99e1-d76dbfd82931","Type":"ContainerDied","Data":"567d668ca551ff6a6f7aa76135f7e6cb746ed37d8f6ef12a873022d9a789fbbf"} Feb 17 15:49:30 crc kubenswrapper[4717]: I0217 15:49:30.013688 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gf9t" event={"ID":"97724a8a-8be7-4d6f-99e1-d76dbfd82931","Type":"ContainerStarted","Data":"252a47206c36fa809efd5284735ff047865fdda110370784f7764870c5d2cee3"} Feb 17 15:49:33 crc kubenswrapper[4717]: I0217 15:49:33.058359 4717 generic.go:334] "Generic (PLEG): container 
finished" podID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" containerID="ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889" exitCode=0 Feb 17 15:49:33 crc kubenswrapper[4717]: I0217 15:49:33.059057 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gf9t" event={"ID":"97724a8a-8be7-4d6f-99e1-d76dbfd82931","Type":"ContainerDied","Data":"ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889"} Feb 17 15:49:33 crc kubenswrapper[4717]: E0217 15:49:33.211322 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97724a8a_8be7_4d6f_99e1_d76dbfd82931.slice/crio-ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889.scope\": RecentStats: unable to find data in memory cache]" Feb 17 15:49:34 crc kubenswrapper[4717]: I0217 15:49:34.072619 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gf9t" event={"ID":"97724a8a-8be7-4d6f-99e1-d76dbfd82931","Type":"ContainerStarted","Data":"6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249"} Feb 17 15:49:34 crc kubenswrapper[4717]: I0217 15:49:34.097578 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9gf9t" podStartSLOduration=2.649421082 podStartE2EDuration="6.097559849s" podCreationTimestamp="2026-02-17 15:49:28 +0000 UTC" firstStartedPulling="2026-02-17 15:49:30.016636477 +0000 UTC m=+3436.432476953" lastFinishedPulling="2026-02-17 15:49:33.464775234 +0000 UTC m=+3439.880615720" observedRunningTime="2026-02-17 15:49:34.092737483 +0000 UTC m=+3440.508577969" watchObservedRunningTime="2026-02-17 15:49:34.097559849 +0000 UTC m=+3440.513400335" Feb 17 15:49:38 crc kubenswrapper[4717]: I0217 15:49:38.943192 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:38 crc kubenswrapper[4717]: I0217 15:49:38.944265 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:39 crc kubenswrapper[4717]: I0217 15:49:39.004014 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:39 crc kubenswrapper[4717]: I0217 15:49:39.187427 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:39 crc kubenswrapper[4717]: I0217 15:49:39.255829 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gf9t"] Feb 17 15:49:41 crc kubenswrapper[4717]: I0217 15:49:41.163979 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9gf9t" podUID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" containerName="registry-server" containerID="cri-o://6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249" gracePeriod=2 Feb 17 15:49:41 crc kubenswrapper[4717]: I0217 15:49:41.692140 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:41 crc kubenswrapper[4717]: I0217 15:49:41.798680 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-utilities\") pod \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " Feb 17 15:49:41 crc kubenswrapper[4717]: I0217 15:49:41.798780 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-catalog-content\") pod \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " Feb 17 15:49:41 crc kubenswrapper[4717]: I0217 15:49:41.798872 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbcqc\" (UniqueName: \"kubernetes.io/projected/97724a8a-8be7-4d6f-99e1-d76dbfd82931-kube-api-access-dbcqc\") pod \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\" (UID: \"97724a8a-8be7-4d6f-99e1-d76dbfd82931\") " Feb 17 15:49:41 crc kubenswrapper[4717]: I0217 15:49:41.799850 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-utilities" (OuterVolumeSpecName: "utilities") pod "97724a8a-8be7-4d6f-99e1-d76dbfd82931" (UID: "97724a8a-8be7-4d6f-99e1-d76dbfd82931"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:49:41 crc kubenswrapper[4717]: I0217 15:49:41.812412 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97724a8a-8be7-4d6f-99e1-d76dbfd82931-kube-api-access-dbcqc" (OuterVolumeSpecName: "kube-api-access-dbcqc") pod "97724a8a-8be7-4d6f-99e1-d76dbfd82931" (UID: "97724a8a-8be7-4d6f-99e1-d76dbfd82931"). InnerVolumeSpecName "kube-api-access-dbcqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:49:41 crc kubenswrapper[4717]: I0217 15:49:41.830836 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97724a8a-8be7-4d6f-99e1-d76dbfd82931" (UID: "97724a8a-8be7-4d6f-99e1-d76dbfd82931"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:49:41 crc kubenswrapper[4717]: I0217 15:49:41.901221 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:49:41 crc kubenswrapper[4717]: I0217 15:49:41.901257 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbcqc\" (UniqueName: \"kubernetes.io/projected/97724a8a-8be7-4d6f-99e1-d76dbfd82931-kube-api-access-dbcqc\") on node \"crc\" DevicePath \"\"" Feb 17 15:49:41 crc kubenswrapper[4717]: I0217 15:49:41.901268 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97724a8a-8be7-4d6f-99e1-d76dbfd82931-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.174533 4717 generic.go:334] "Generic (PLEG): container finished" podID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" containerID="6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249" exitCode=0 Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.174600 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gf9t" event={"ID":"97724a8a-8be7-4d6f-99e1-d76dbfd82931","Type":"ContainerDied","Data":"6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249"} Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.174620 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gf9t" Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.174648 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gf9t" event={"ID":"97724a8a-8be7-4d6f-99e1-d76dbfd82931","Type":"ContainerDied","Data":"252a47206c36fa809efd5284735ff047865fdda110370784f7764870c5d2cee3"} Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.174669 4717 scope.go:117] "RemoveContainer" containerID="6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249" Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.199969 4717 scope.go:117] "RemoveContainer" containerID="ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889" Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.206908 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gf9t"] Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.215585 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gf9t"] Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.227968 4717 scope.go:117] "RemoveContainer" containerID="567d668ca551ff6a6f7aa76135f7e6cb746ed37d8f6ef12a873022d9a789fbbf" Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.280479 4717 scope.go:117] "RemoveContainer" containerID="6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249" Feb 17 15:49:42 crc kubenswrapper[4717]: E0217 15:49:42.281140 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249\": container with ID starting with 6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249 not found: ID does not exist" containerID="6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249" Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.281169 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249"} err="failed to get container status \"6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249\": rpc error: code = NotFound desc = could not find container \"6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249\": container with ID starting with 6b0fa0d8a8beaa1744f68649d37a88f78800d58e6555d0c03f25ad290cf5f249 not found: ID does not exist" Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.281190 4717 scope.go:117] "RemoveContainer" containerID="ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889" Feb 17 15:49:42 crc kubenswrapper[4717]: E0217 15:49:42.281473 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889\": container with ID starting with ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889 not found: ID does not exist" containerID="ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889" Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.281519 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889"} err="failed to get container status \"ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889\": rpc error: code = NotFound desc = could not find container \"ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889\": container with ID starting with ea53c6b2c10dccdb41452736563b44b44469f1b382a6913cad333521f6cd0889 not found: ID does not exist" Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.281556 4717 scope.go:117] "RemoveContainer" containerID="567d668ca551ff6a6f7aa76135f7e6cb746ed37d8f6ef12a873022d9a789fbbf" Feb 17 15:49:42 crc kubenswrapper[4717]: E0217 
15:49:42.281875 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"567d668ca551ff6a6f7aa76135f7e6cb746ed37d8f6ef12a873022d9a789fbbf\": container with ID starting with 567d668ca551ff6a6f7aa76135f7e6cb746ed37d8f6ef12a873022d9a789fbbf not found: ID does not exist" containerID="567d668ca551ff6a6f7aa76135f7e6cb746ed37d8f6ef12a873022d9a789fbbf" Feb 17 15:49:42 crc kubenswrapper[4717]: I0217 15:49:42.281917 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"567d668ca551ff6a6f7aa76135f7e6cb746ed37d8f6ef12a873022d9a789fbbf"} err="failed to get container status \"567d668ca551ff6a6f7aa76135f7e6cb746ed37d8f6ef12a873022d9a789fbbf\": rpc error: code = NotFound desc = could not find container \"567d668ca551ff6a6f7aa76135f7e6cb746ed37d8f6ef12a873022d9a789fbbf\": container with ID starting with 567d668ca551ff6a6f7aa76135f7e6cb746ed37d8f6ef12a873022d9a789fbbf not found: ID does not exist" Feb 17 15:49:43 crc kubenswrapper[4717]: I0217 15:49:43.870948 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" path="/var/lib/kubelet/pods/97724a8a-8be7-4d6f-99e1-d76dbfd82931/volumes" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.274383 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6p67t"] Feb 17 15:49:49 crc kubenswrapper[4717]: E0217 15:49:49.276038 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" containerName="extract-utilities" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.276069 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" containerName="extract-utilities" Feb 17 15:49:49 crc kubenswrapper[4717]: E0217 15:49:49.276119 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" containerName="extract-content" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.276128 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" containerName="extract-content" Feb 17 15:49:49 crc kubenswrapper[4717]: E0217 15:49:49.276168 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" containerName="registry-server" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.276176 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" containerName="registry-server" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.276740 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="97724a8a-8be7-4d6f-99e1-d76dbfd82931" containerName="registry-server" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.316658 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6p67t"] Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.316795 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.414932 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-utilities\") pod \"community-operators-6p67t\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.415330 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7jn5\" (UniqueName: \"kubernetes.io/projected/f6528361-4fe1-495c-8ee8-b23f327ce618-kube-api-access-k7jn5\") pod \"community-operators-6p67t\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.415475 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-catalog-content\") pod \"community-operators-6p67t\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.517290 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-utilities\") pod \"community-operators-6p67t\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.517345 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7jn5\" (UniqueName: \"kubernetes.io/projected/f6528361-4fe1-495c-8ee8-b23f327ce618-kube-api-access-k7jn5\") pod 
\"community-operators-6p67t\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.517424 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-catalog-content\") pod \"community-operators-6p67t\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.517853 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-utilities\") pod \"community-operators-6p67t\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.517963 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-catalog-content\") pod \"community-operators-6p67t\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.542019 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7jn5\" (UniqueName: \"kubernetes.io/projected/f6528361-4fe1-495c-8ee8-b23f327ce618-kube-api-access-k7jn5\") pod \"community-operators-6p67t\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:49 crc kubenswrapper[4717]: I0217 15:49:49.654204 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:50 crc kubenswrapper[4717]: I0217 15:49:50.194846 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6p67t"] Feb 17 15:49:50 crc kubenswrapper[4717]: I0217 15:49:50.352864 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p67t" event={"ID":"f6528361-4fe1-495c-8ee8-b23f327ce618","Type":"ContainerStarted","Data":"27ddfa3ef2b6fadff48cb2f1f67b32aafed4d3036386e92d18a9177f3a0745cd"} Feb 17 15:49:51 crc kubenswrapper[4717]: I0217 15:49:51.375663 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6528361-4fe1-495c-8ee8-b23f327ce618" containerID="b7488e7ca66b0e6e6de3baf0a44a93e0f4af9803ccc27eb4951fe6b749f86e8a" exitCode=0 Feb 17 15:49:51 crc kubenswrapper[4717]: I0217 15:49:51.376322 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p67t" event={"ID":"f6528361-4fe1-495c-8ee8-b23f327ce618","Type":"ContainerDied","Data":"b7488e7ca66b0e6e6de3baf0a44a93e0f4af9803ccc27eb4951fe6b749f86e8a"} Feb 17 15:49:52 crc kubenswrapper[4717]: I0217 15:49:52.388148 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p67t" event={"ID":"f6528361-4fe1-495c-8ee8-b23f327ce618","Type":"ContainerStarted","Data":"f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55"} Feb 17 15:49:53 crc kubenswrapper[4717]: I0217 15:49:53.399247 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6528361-4fe1-495c-8ee8-b23f327ce618" containerID="f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55" exitCode=0 Feb 17 15:49:53 crc kubenswrapper[4717]: I0217 15:49:53.399417 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p67t" 
event={"ID":"f6528361-4fe1-495c-8ee8-b23f327ce618","Type":"ContainerDied","Data":"f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55"} Feb 17 15:49:54 crc kubenswrapper[4717]: I0217 15:49:54.408200 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p67t" event={"ID":"f6528361-4fe1-495c-8ee8-b23f327ce618","Type":"ContainerStarted","Data":"6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3"} Feb 17 15:49:54 crc kubenswrapper[4717]: I0217 15:49:54.430110 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6p67t" podStartSLOduration=3.044675705 podStartE2EDuration="5.430061804s" podCreationTimestamp="2026-02-17 15:49:49 +0000 UTC" firstStartedPulling="2026-02-17 15:49:51.394458068 +0000 UTC m=+3457.810298564" lastFinishedPulling="2026-02-17 15:49:53.779844177 +0000 UTC m=+3460.195684663" observedRunningTime="2026-02-17 15:49:54.424208729 +0000 UTC m=+3460.840049235" watchObservedRunningTime="2026-02-17 15:49:54.430061804 +0000 UTC m=+3460.845902300" Feb 17 15:49:59 crc kubenswrapper[4717]: I0217 15:49:59.654633 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:59 crc kubenswrapper[4717]: I0217 15:49:59.655417 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:49:59 crc kubenswrapper[4717]: I0217 15:49:59.710834 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:50:00 crc kubenswrapper[4717]: I0217 15:50:00.507985 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:50:00 crc kubenswrapper[4717]: I0217 15:50:00.564007 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-6p67t"] Feb 17 15:50:02 crc kubenswrapper[4717]: I0217 15:50:02.486831 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6p67t" podUID="f6528361-4fe1-495c-8ee8-b23f327ce618" containerName="registry-server" containerID="cri-o://6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3" gracePeriod=2 Feb 17 15:50:02 crc kubenswrapper[4717]: I0217 15:50:02.970536 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.094993 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-catalog-content\") pod \"f6528361-4fe1-495c-8ee8-b23f327ce618\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.095258 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7jn5\" (UniqueName: \"kubernetes.io/projected/f6528361-4fe1-495c-8ee8-b23f327ce618-kube-api-access-k7jn5\") pod \"f6528361-4fe1-495c-8ee8-b23f327ce618\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.095325 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-utilities\") pod \"f6528361-4fe1-495c-8ee8-b23f327ce618\" (UID: \"f6528361-4fe1-495c-8ee8-b23f327ce618\") " Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.096136 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-utilities" (OuterVolumeSpecName: "utilities") pod "f6528361-4fe1-495c-8ee8-b23f327ce618" (UID: 
"f6528361-4fe1-495c-8ee8-b23f327ce618"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.100102 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6528361-4fe1-495c-8ee8-b23f327ce618-kube-api-access-k7jn5" (OuterVolumeSpecName: "kube-api-access-k7jn5") pod "f6528361-4fe1-495c-8ee8-b23f327ce618" (UID: "f6528361-4fe1-495c-8ee8-b23f327ce618"). InnerVolumeSpecName "kube-api-access-k7jn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.151327 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6528361-4fe1-495c-8ee8-b23f327ce618" (UID: "f6528361-4fe1-495c-8ee8-b23f327ce618"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.197034 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.197064 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7jn5\" (UniqueName: \"kubernetes.io/projected/f6528361-4fe1-495c-8ee8-b23f327ce618-kube-api-access-k7jn5\") on node \"crc\" DevicePath \"\"" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.197075 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6528361-4fe1-495c-8ee8-b23f327ce618-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.504982 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="f6528361-4fe1-495c-8ee8-b23f327ce618" containerID="6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3" exitCode=0 Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.505106 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6p67t" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.506456 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p67t" event={"ID":"f6528361-4fe1-495c-8ee8-b23f327ce618","Type":"ContainerDied","Data":"6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3"} Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.506658 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p67t" event={"ID":"f6528361-4fe1-495c-8ee8-b23f327ce618","Type":"ContainerDied","Data":"27ddfa3ef2b6fadff48cb2f1f67b32aafed4d3036386e92d18a9177f3a0745cd"} Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.506728 4717 scope.go:117] "RemoveContainer" containerID="6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.545512 4717 scope.go:117] "RemoveContainer" containerID="f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.614969 4717 scope.go:117] "RemoveContainer" containerID="b7488e7ca66b0e6e6de3baf0a44a93e0f4af9803ccc27eb4951fe6b749f86e8a" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.671189 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6p67t"] Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.671803 4717 scope.go:117] "RemoveContainer" containerID="6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3" Feb 17 15:50:03 crc kubenswrapper[4717]: E0217 15:50:03.672316 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3\": container with ID starting with 6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3 not found: ID does not exist" containerID="6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.672376 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3"} err="failed to get container status \"6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3\": rpc error: code = NotFound desc = could not find container \"6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3\": container with ID starting with 6d5ad74aaef5f307e8dd15122e8dc15c88a305fd4cbcf4c34c00ac7002b9f9c3 not found: ID does not exist" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.672416 4717 scope.go:117] "RemoveContainer" containerID="f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55" Feb 17 15:50:03 crc kubenswrapper[4717]: E0217 15:50:03.673240 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55\": container with ID starting with f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55 not found: ID does not exist" containerID="f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.673307 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55"} err="failed to get container status \"f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55\": rpc error: code = NotFound desc = could not find container 
\"f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55\": container with ID starting with f237d7544d1b5704c987ce6720436ccc54478d0ca92f132a9e7139937f694f55 not found: ID does not exist" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.673351 4717 scope.go:117] "RemoveContainer" containerID="b7488e7ca66b0e6e6de3baf0a44a93e0f4af9803ccc27eb4951fe6b749f86e8a" Feb 17 15:50:03 crc kubenswrapper[4717]: E0217 15:50:03.674600 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7488e7ca66b0e6e6de3baf0a44a93e0f4af9803ccc27eb4951fe6b749f86e8a\": container with ID starting with b7488e7ca66b0e6e6de3baf0a44a93e0f4af9803ccc27eb4951fe6b749f86e8a not found: ID does not exist" containerID="b7488e7ca66b0e6e6de3baf0a44a93e0f4af9803ccc27eb4951fe6b749f86e8a" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.674633 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7488e7ca66b0e6e6de3baf0a44a93e0f4af9803ccc27eb4951fe6b749f86e8a"} err="failed to get container status \"b7488e7ca66b0e6e6de3baf0a44a93e0f4af9803ccc27eb4951fe6b749f86e8a\": rpc error: code = NotFound desc = could not find container \"b7488e7ca66b0e6e6de3baf0a44a93e0f4af9803ccc27eb4951fe6b749f86e8a\": container with ID starting with b7488e7ca66b0e6e6de3baf0a44a93e0f4af9803ccc27eb4951fe6b749f86e8a not found: ID does not exist" Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.677725 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6p67t"] Feb 17 15:50:03 crc kubenswrapper[4717]: I0217 15:50:03.865177 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6528361-4fe1-495c-8ee8-b23f327ce618" path="/var/lib/kubelet/pods/f6528361-4fe1-495c-8ee8-b23f327ce618/volumes" Feb 17 15:50:04 crc kubenswrapper[4717]: I0217 15:50:04.516770 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="740a901f-8b10-42b7-a747-8c405d4cc01f" containerID="928777e45c7ce3344fcf275b44467c9a5fe5e47f7ad319ea88637a94be76c3a8" exitCode=0 Feb 17 15:50:04 crc kubenswrapper[4717]: I0217 15:50:04.516836 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" event={"ID":"740a901f-8b10-42b7-a747-8c405d4cc01f","Type":"ContainerDied","Data":"928777e45c7ce3344fcf275b44467c9a5fe5e47f7ad319ea88637a94be76c3a8"} Feb 17 15:50:05 crc kubenswrapper[4717]: I0217 15:50:05.656968 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" Feb 17 15:50:05 crc kubenswrapper[4717]: I0217 15:50:05.700608 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dwwm7/crc-debug-gk6q2"] Feb 17 15:50:05 crc kubenswrapper[4717]: I0217 15:50:05.711878 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dwwm7/crc-debug-gk6q2"] Feb 17 15:50:05 crc kubenswrapper[4717]: I0217 15:50:05.745581 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nntcr\" (UniqueName: \"kubernetes.io/projected/740a901f-8b10-42b7-a747-8c405d4cc01f-kube-api-access-nntcr\") pod \"740a901f-8b10-42b7-a747-8c405d4cc01f\" (UID: \"740a901f-8b10-42b7-a747-8c405d4cc01f\") " Feb 17 15:50:05 crc kubenswrapper[4717]: I0217 15:50:05.745745 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/740a901f-8b10-42b7-a747-8c405d4cc01f-host\") pod \"740a901f-8b10-42b7-a747-8c405d4cc01f\" (UID: \"740a901f-8b10-42b7-a747-8c405d4cc01f\") " Feb 17 15:50:05 crc kubenswrapper[4717]: I0217 15:50:05.745815 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740a901f-8b10-42b7-a747-8c405d4cc01f-host" (OuterVolumeSpecName: "host") pod "740a901f-8b10-42b7-a747-8c405d4cc01f" (UID: 
"740a901f-8b10-42b7-a747-8c405d4cc01f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:50:05 crc kubenswrapper[4717]: I0217 15:50:05.746626 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/740a901f-8b10-42b7-a747-8c405d4cc01f-host\") on node \"crc\" DevicePath \"\"" Feb 17 15:50:05 crc kubenswrapper[4717]: I0217 15:50:05.760374 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740a901f-8b10-42b7-a747-8c405d4cc01f-kube-api-access-nntcr" (OuterVolumeSpecName: "kube-api-access-nntcr") pod "740a901f-8b10-42b7-a747-8c405d4cc01f" (UID: "740a901f-8b10-42b7-a747-8c405d4cc01f"). InnerVolumeSpecName "kube-api-access-nntcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:50:05 crc kubenswrapper[4717]: I0217 15:50:05.848404 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nntcr\" (UniqueName: \"kubernetes.io/projected/740a901f-8b10-42b7-a747-8c405d4cc01f-kube-api-access-nntcr\") on node \"crc\" DevicePath \"\"" Feb 17 15:50:05 crc kubenswrapper[4717]: I0217 15:50:05.864942 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740a901f-8b10-42b7-a747-8c405d4cc01f" path="/var/lib/kubelet/pods/740a901f-8b10-42b7-a747-8c405d4cc01f/volumes" Feb 17 15:50:06 crc kubenswrapper[4717]: I0217 15:50:06.556546 4717 scope.go:117] "RemoveContainer" containerID="928777e45c7ce3344fcf275b44467c9a5fe5e47f7ad319ea88637a94be76c3a8" Feb 17 15:50:06 crc kubenswrapper[4717]: I0217 15:50:06.556586 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-gk6q2" Feb 17 15:50:06 crc kubenswrapper[4717]: I0217 15:50:06.947681 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dwwm7/crc-debug-vmlld"] Feb 17 15:50:06 crc kubenswrapper[4717]: E0217 15:50:06.948681 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6528361-4fe1-495c-8ee8-b23f327ce618" containerName="registry-server" Feb 17 15:50:06 crc kubenswrapper[4717]: I0217 15:50:06.948704 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6528361-4fe1-495c-8ee8-b23f327ce618" containerName="registry-server" Feb 17 15:50:06 crc kubenswrapper[4717]: E0217 15:50:06.948747 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740a901f-8b10-42b7-a747-8c405d4cc01f" containerName="container-00" Feb 17 15:50:06 crc kubenswrapper[4717]: I0217 15:50:06.948760 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="740a901f-8b10-42b7-a747-8c405d4cc01f" containerName="container-00" Feb 17 15:50:06 crc kubenswrapper[4717]: E0217 15:50:06.948783 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6528361-4fe1-495c-8ee8-b23f327ce618" containerName="extract-content" Feb 17 15:50:06 crc kubenswrapper[4717]: I0217 15:50:06.948798 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6528361-4fe1-495c-8ee8-b23f327ce618" containerName="extract-content" Feb 17 15:50:06 crc kubenswrapper[4717]: E0217 15:50:06.948846 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6528361-4fe1-495c-8ee8-b23f327ce618" containerName="extract-utilities" Feb 17 15:50:06 crc kubenswrapper[4717]: I0217 15:50:06.948859 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6528361-4fe1-495c-8ee8-b23f327ce618" containerName="extract-utilities" Feb 17 15:50:06 crc kubenswrapper[4717]: I0217 15:50:06.949311 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6528361-4fe1-495c-8ee8-b23f327ce618" 
containerName="registry-server" Feb 17 15:50:06 crc kubenswrapper[4717]: I0217 15:50:06.949340 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="740a901f-8b10-42b7-a747-8c405d4cc01f" containerName="container-00" Feb 17 15:50:06 crc kubenswrapper[4717]: I0217 15:50:06.950647 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-vmlld" Feb 17 15:50:06 crc kubenswrapper[4717]: I0217 15:50:06.954213 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dwwm7"/"default-dockercfg-gkfm5" Feb 17 15:50:07 crc kubenswrapper[4717]: I0217 15:50:07.079792 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcmvd\" (UniqueName: \"kubernetes.io/projected/db47598a-b85b-4d23-9a19-e2e0666cc752-kube-api-access-jcmvd\") pod \"crc-debug-vmlld\" (UID: \"db47598a-b85b-4d23-9a19-e2e0666cc752\") " pod="openshift-must-gather-dwwm7/crc-debug-vmlld" Feb 17 15:50:07 crc kubenswrapper[4717]: I0217 15:50:07.080201 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db47598a-b85b-4d23-9a19-e2e0666cc752-host\") pod \"crc-debug-vmlld\" (UID: \"db47598a-b85b-4d23-9a19-e2e0666cc752\") " pod="openshift-must-gather-dwwm7/crc-debug-vmlld" Feb 17 15:50:07 crc kubenswrapper[4717]: I0217 15:50:07.182939 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db47598a-b85b-4d23-9a19-e2e0666cc752-host\") pod \"crc-debug-vmlld\" (UID: \"db47598a-b85b-4d23-9a19-e2e0666cc752\") " pod="openshift-must-gather-dwwm7/crc-debug-vmlld" Feb 17 15:50:07 crc kubenswrapper[4717]: I0217 15:50:07.183107 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db47598a-b85b-4d23-9a19-e2e0666cc752-host\") pod 
\"crc-debug-vmlld\" (UID: \"db47598a-b85b-4d23-9a19-e2e0666cc752\") " pod="openshift-must-gather-dwwm7/crc-debug-vmlld" Feb 17 15:50:07 crc kubenswrapper[4717]: I0217 15:50:07.183141 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcmvd\" (UniqueName: \"kubernetes.io/projected/db47598a-b85b-4d23-9a19-e2e0666cc752-kube-api-access-jcmvd\") pod \"crc-debug-vmlld\" (UID: \"db47598a-b85b-4d23-9a19-e2e0666cc752\") " pod="openshift-must-gather-dwwm7/crc-debug-vmlld" Feb 17 15:50:07 crc kubenswrapper[4717]: I0217 15:50:07.223819 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcmvd\" (UniqueName: \"kubernetes.io/projected/db47598a-b85b-4d23-9a19-e2e0666cc752-kube-api-access-jcmvd\") pod \"crc-debug-vmlld\" (UID: \"db47598a-b85b-4d23-9a19-e2e0666cc752\") " pod="openshift-must-gather-dwwm7/crc-debug-vmlld" Feb 17 15:50:07 crc kubenswrapper[4717]: I0217 15:50:07.283590 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-vmlld" Feb 17 15:50:07 crc kubenswrapper[4717]: W0217 15:50:07.343247 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb47598a_b85b_4d23_9a19_e2e0666cc752.slice/crio-dbc8099dd76bc432ad332e84a35a188ca8fcb217029ea0f360144270c633141b WatchSource:0}: Error finding container dbc8099dd76bc432ad332e84a35a188ca8fcb217029ea0f360144270c633141b: Status 404 returned error can't find the container with id dbc8099dd76bc432ad332e84a35a188ca8fcb217029ea0f360144270c633141b Feb 17 15:50:07 crc kubenswrapper[4717]: I0217 15:50:07.578627 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dwwm7/crc-debug-vmlld" event={"ID":"db47598a-b85b-4d23-9a19-e2e0666cc752","Type":"ContainerStarted","Data":"dbc8099dd76bc432ad332e84a35a188ca8fcb217029ea0f360144270c633141b"} Feb 17 15:50:08 crc kubenswrapper[4717]: I0217 15:50:08.589062 4717 generic.go:334] "Generic (PLEG): container finished" podID="db47598a-b85b-4d23-9a19-e2e0666cc752" containerID="92ca07a42d289cf9a4b75aa004ddd7390e79d7df17e499b581eda64fa15920c5" exitCode=0 Feb 17 15:50:08 crc kubenswrapper[4717]: I0217 15:50:08.589146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dwwm7/crc-debug-vmlld" event={"ID":"db47598a-b85b-4d23-9a19-e2e0666cc752","Type":"ContainerDied","Data":"92ca07a42d289cf9a4b75aa004ddd7390e79d7df17e499b581eda64fa15920c5"} Feb 17 15:50:09 crc kubenswrapper[4717]: I0217 15:50:09.124236 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dwwm7/crc-debug-vmlld"] Feb 17 15:50:09 crc kubenswrapper[4717]: I0217 15:50:09.133262 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dwwm7/crc-debug-vmlld"] Feb 17 15:50:09 crc kubenswrapper[4717]: I0217 15:50:09.711395 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-vmlld" Feb 17 15:50:09 crc kubenswrapper[4717]: I0217 15:50:09.839958 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db47598a-b85b-4d23-9a19-e2e0666cc752-host\") pod \"db47598a-b85b-4d23-9a19-e2e0666cc752\" (UID: \"db47598a-b85b-4d23-9a19-e2e0666cc752\") " Feb 17 15:50:09 crc kubenswrapper[4717]: I0217 15:50:09.840095 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db47598a-b85b-4d23-9a19-e2e0666cc752-host" (OuterVolumeSpecName: "host") pod "db47598a-b85b-4d23-9a19-e2e0666cc752" (UID: "db47598a-b85b-4d23-9a19-e2e0666cc752"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:50:09 crc kubenswrapper[4717]: I0217 15:50:09.840142 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcmvd\" (UniqueName: \"kubernetes.io/projected/db47598a-b85b-4d23-9a19-e2e0666cc752-kube-api-access-jcmvd\") pod \"db47598a-b85b-4d23-9a19-e2e0666cc752\" (UID: \"db47598a-b85b-4d23-9a19-e2e0666cc752\") " Feb 17 15:50:09 crc kubenswrapper[4717]: I0217 15:50:09.840511 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db47598a-b85b-4d23-9a19-e2e0666cc752-host\") on node \"crc\" DevicePath \"\"" Feb 17 15:50:09 crc kubenswrapper[4717]: I0217 15:50:09.845249 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db47598a-b85b-4d23-9a19-e2e0666cc752-kube-api-access-jcmvd" (OuterVolumeSpecName: "kube-api-access-jcmvd") pod "db47598a-b85b-4d23-9a19-e2e0666cc752" (UID: "db47598a-b85b-4d23-9a19-e2e0666cc752"). InnerVolumeSpecName "kube-api-access-jcmvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:50:09 crc kubenswrapper[4717]: I0217 15:50:09.856446 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db47598a-b85b-4d23-9a19-e2e0666cc752" path="/var/lib/kubelet/pods/db47598a-b85b-4d23-9a19-e2e0666cc752/volumes" Feb 17 15:50:09 crc kubenswrapper[4717]: I0217 15:50:09.943574 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcmvd\" (UniqueName: \"kubernetes.io/projected/db47598a-b85b-4d23-9a19-e2e0666cc752-kube-api-access-jcmvd\") on node \"crc\" DevicePath \"\"" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.340650 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dwwm7/crc-debug-cs7h2"] Feb 17 15:50:10 crc kubenswrapper[4717]: E0217 15:50:10.341922 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db47598a-b85b-4d23-9a19-e2e0666cc752" containerName="container-00" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.341979 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="db47598a-b85b-4d23-9a19-e2e0666cc752" containerName="container-00" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.342377 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="db47598a-b85b-4d23-9a19-e2e0666cc752" containerName="container-00" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.343468 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.451891 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8bl5\" (UniqueName: \"kubernetes.io/projected/1e21ded8-4590-4374-a8bb-a6097a1636d8-kube-api-access-l8bl5\") pod \"crc-debug-cs7h2\" (UID: \"1e21ded8-4590-4374-a8bb-a6097a1636d8\") " pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.451990 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e21ded8-4590-4374-a8bb-a6097a1636d8-host\") pod \"crc-debug-cs7h2\" (UID: \"1e21ded8-4590-4374-a8bb-a6097a1636d8\") " pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.554274 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8bl5\" (UniqueName: \"kubernetes.io/projected/1e21ded8-4590-4374-a8bb-a6097a1636d8-kube-api-access-l8bl5\") pod \"crc-debug-cs7h2\" (UID: \"1e21ded8-4590-4374-a8bb-a6097a1636d8\") " pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.554432 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e21ded8-4590-4374-a8bb-a6097a1636d8-host\") pod \"crc-debug-cs7h2\" (UID: \"1e21ded8-4590-4374-a8bb-a6097a1636d8\") " pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.554574 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e21ded8-4590-4374-a8bb-a6097a1636d8-host\") pod \"crc-debug-cs7h2\" (UID: \"1e21ded8-4590-4374-a8bb-a6097a1636d8\") " pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" Feb 17 15:50:10 crc 
kubenswrapper[4717]: I0217 15:50:10.575918 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8bl5\" (UniqueName: \"kubernetes.io/projected/1e21ded8-4590-4374-a8bb-a6097a1636d8-kube-api-access-l8bl5\") pod \"crc-debug-cs7h2\" (UID: \"1e21ded8-4590-4374-a8bb-a6097a1636d8\") " pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.611873 4717 scope.go:117] "RemoveContainer" containerID="92ca07a42d289cf9a4b75aa004ddd7390e79d7df17e499b581eda64fa15920c5" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.611899 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-vmlld" Feb 17 15:50:10 crc kubenswrapper[4717]: I0217 15:50:10.664216 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" Feb 17 15:50:10 crc kubenswrapper[4717]: W0217 15:50:10.692501 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e21ded8_4590_4374_a8bb_a6097a1636d8.slice/crio-66f48e94e1b16187878428f915a26309574936e99215f462c70b5996c1f96274 WatchSource:0}: Error finding container 66f48e94e1b16187878428f915a26309574936e99215f462c70b5996c1f96274: Status 404 returned error can't find the container with id 66f48e94e1b16187878428f915a26309574936e99215f462c70b5996c1f96274 Feb 17 15:50:11 crc kubenswrapper[4717]: I0217 15:50:11.624963 4717 generic.go:334] "Generic (PLEG): container finished" podID="1e21ded8-4590-4374-a8bb-a6097a1636d8" containerID="2cd499d300bc5f91222abc9ce3ec799328d088955a9b264718da00b3c76ac2c5" exitCode=0 Feb 17 15:50:11 crc kubenswrapper[4717]: I0217 15:50:11.625326 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" 
event={"ID":"1e21ded8-4590-4374-a8bb-a6097a1636d8","Type":"ContainerDied","Data":"2cd499d300bc5f91222abc9ce3ec799328d088955a9b264718da00b3c76ac2c5"} Feb 17 15:50:11 crc kubenswrapper[4717]: I0217 15:50:11.625364 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" event={"ID":"1e21ded8-4590-4374-a8bb-a6097a1636d8","Type":"ContainerStarted","Data":"66f48e94e1b16187878428f915a26309574936e99215f462c70b5996c1f96274"} Feb 17 15:50:11 crc kubenswrapper[4717]: I0217 15:50:11.666457 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dwwm7/crc-debug-cs7h2"] Feb 17 15:50:11 crc kubenswrapper[4717]: I0217 15:50:11.678564 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dwwm7/crc-debug-cs7h2"] Feb 17 15:50:12 crc kubenswrapper[4717]: I0217 15:50:12.748029 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" Feb 17 15:50:12 crc kubenswrapper[4717]: I0217 15:50:12.896763 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e21ded8-4590-4374-a8bb-a6097a1636d8-host\") pod \"1e21ded8-4590-4374-a8bb-a6097a1636d8\" (UID: \"1e21ded8-4590-4374-a8bb-a6097a1636d8\") " Feb 17 15:50:12 crc kubenswrapper[4717]: I0217 15:50:12.896838 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8bl5\" (UniqueName: \"kubernetes.io/projected/1e21ded8-4590-4374-a8bb-a6097a1636d8-kube-api-access-l8bl5\") pod \"1e21ded8-4590-4374-a8bb-a6097a1636d8\" (UID: \"1e21ded8-4590-4374-a8bb-a6097a1636d8\") " Feb 17 15:50:12 crc kubenswrapper[4717]: I0217 15:50:12.896899 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e21ded8-4590-4374-a8bb-a6097a1636d8-host" (OuterVolumeSpecName: "host") pod "1e21ded8-4590-4374-a8bb-a6097a1636d8" (UID: 
"1e21ded8-4590-4374-a8bb-a6097a1636d8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:50:12 crc kubenswrapper[4717]: I0217 15:50:12.897647 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e21ded8-4590-4374-a8bb-a6097a1636d8-host\") on node \"crc\" DevicePath \"\"" Feb 17 15:50:12 crc kubenswrapper[4717]: I0217 15:50:12.903311 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e21ded8-4590-4374-a8bb-a6097a1636d8-kube-api-access-l8bl5" (OuterVolumeSpecName: "kube-api-access-l8bl5") pod "1e21ded8-4590-4374-a8bb-a6097a1636d8" (UID: "1e21ded8-4590-4374-a8bb-a6097a1636d8"). InnerVolumeSpecName "kube-api-access-l8bl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:50:13 crc kubenswrapper[4717]: I0217 15:50:13.000046 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8bl5\" (UniqueName: \"kubernetes.io/projected/1e21ded8-4590-4374-a8bb-a6097a1636d8-kube-api-access-l8bl5\") on node \"crc\" DevicePath \"\"" Feb 17 15:50:13 crc kubenswrapper[4717]: I0217 15:50:13.647511 4717 scope.go:117] "RemoveContainer" containerID="2cd499d300bc5f91222abc9ce3ec799328d088955a9b264718da00b3c76ac2c5" Feb 17 15:50:13 crc kubenswrapper[4717]: I0217 15:50:13.647614 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dwwm7/crc-debug-cs7h2" Feb 17 15:50:13 crc kubenswrapper[4717]: I0217 15:50:13.855098 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e21ded8-4590-4374-a8bb-a6097a1636d8" path="/var/lib/kubelet/pods/1e21ded8-4590-4374-a8bb-a6097a1636d8/volumes" Feb 17 15:50:30 crc kubenswrapper[4717]: I0217 15:50:30.435347 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-dc9f5cc6-vmvkc_24915ee8-ac0d-4c45-826f-6ca18351c1fd/barbican-api/0.log" Feb 17 15:50:30 crc kubenswrapper[4717]: I0217 15:50:30.609692 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-dc9f5cc6-vmvkc_24915ee8-ac0d-4c45-826f-6ca18351c1fd/barbican-api-log/0.log" Feb 17 15:50:30 crc kubenswrapper[4717]: I0217 15:50:30.650401 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-678d9d64d6-sj9md_492fb01b-1889-47ff-b334-750b9614cc60/barbican-keystone-listener/0.log" Feb 17 15:50:30 crc kubenswrapper[4717]: I0217 15:50:30.690207 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-678d9d64d6-sj9md_492fb01b-1889-47ff-b334-750b9614cc60/barbican-keystone-listener-log/0.log" Feb 17 15:50:30 crc kubenswrapper[4717]: I0217 15:50:30.798941 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-96ddcd589-gq7qg_0b3b8192-afd0-44c1-886f-7dc072112460/barbican-worker/0.log" Feb 17 15:50:30 crc kubenswrapper[4717]: I0217 15:50:30.886978 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-96ddcd589-gq7qg_0b3b8192-afd0-44c1-886f-7dc072112460/barbican-worker-log/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.045675 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk_fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.099279 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fa734ee-b468-4462-850d-9f347c991241/ceilometer-central-agent/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.126321 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fa734ee-b468-4462-850d-9f347c991241/ceilometer-notification-agent/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.220718 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fa734ee-b468-4462-850d-9f347c991241/proxy-httpd/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.268566 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fa734ee-b468-4462-850d-9f347c991241/sg-core/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.365202 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_880f22f8-f3a5-479a-b456-7afdd5e7d96e/cinder-api/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.416953 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_880f22f8-f3a5-479a-b456-7afdd5e7d96e/cinder-api-log/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.528879 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7eef2c95-16ed-4b57-a95a-aa5b302ec564/cinder-scheduler/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.584394 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7eef2c95-16ed-4b57-a95a-aa5b302ec564/probe/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.674005 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh_ce8505ef-a14d-4936-b9c2-4334b5cf69b1/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.782891 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4_d0c4edef-fee0-490a-8c25-9e4c9950c04f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:31 crc kubenswrapper[4717]: I0217 15:50:31.850022 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-dncpb_394b7b44-48ed-406e-be48-6f7cffabfaf9/init/0.log" Feb 17 15:50:32 crc kubenswrapper[4717]: I0217 15:50:32.020100 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-dncpb_394b7b44-48ed-406e-be48-6f7cffabfaf9/init/0.log" Feb 17 15:50:32 crc kubenswrapper[4717]: I0217 15:50:32.074535 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xt86s_6f01b139-f61b-4935-930c-65756bd54cdc/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:32 crc kubenswrapper[4717]: I0217 15:50:32.096584 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-dncpb_394b7b44-48ed-406e-be48-6f7cffabfaf9/dnsmasq-dns/0.log" Feb 17 15:50:32 crc kubenswrapper[4717]: I0217 15:50:32.253372 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f/glance-httpd/0.log" Feb 17 15:50:32 crc kubenswrapper[4717]: I0217 15:50:32.291971 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f/glance-log/0.log" Feb 17 15:50:32 crc kubenswrapper[4717]: I0217 15:50:32.417333 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_cb47f318-2779-4288-b0ce-775766436b6b/glance-log/0.log" Feb 17 15:50:32 crc kubenswrapper[4717]: I0217 15:50:32.440156 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cb47f318-2779-4288-b0ce-775766436b6b/glance-httpd/0.log" Feb 17 15:50:32 crc kubenswrapper[4717]: I0217 15:50:32.611073 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-85b46995b-rj5bq_04bb64de-6640-4f6a-9052-ff0edf9dacb8/horizon/0.log" Feb 17 15:50:32 crc kubenswrapper[4717]: I0217 15:50:32.746754 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk_8d553699-af50-41f7-b4da-1e0182788f60/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:32 crc kubenswrapper[4717]: I0217 15:50:32.884693 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mgr9p_7b728f97-9d73-433e-a910-13591303221e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:32 crc kubenswrapper[4717]: I0217 15:50:32.886708 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-85b46995b-rj5bq_04bb64de-6640-4f6a-9052-ff0edf9dacb8/horizon-log/0.log" Feb 17 15:50:33 crc kubenswrapper[4717]: I0217 15:50:33.104554 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_10b56503-0d1b-474b-8b7c-0d07f5eae27b/kube-state-metrics/0.log" Feb 17 15:50:33 crc kubenswrapper[4717]: I0217 15:50:33.149617 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-96f9cc575-vd9jv_c8ee178e-3086-48e3-ad91-aefa00e0d10e/keystone-api/0.log" Feb 17 15:50:33 crc kubenswrapper[4717]: I0217 15:50:33.312338 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ts54t_fff26fba-baaf-4ed4-9c5b-b6dec300d19c/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:33 crc kubenswrapper[4717]: I0217 15:50:33.618067 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b696c957f-jwh8l_d5fc5d05-07ba-476e-835e-61dfa2e9edc1/neutron-api/0.log" Feb 17 15:50:33 crc kubenswrapper[4717]: I0217 15:50:33.653668 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b696c957f-jwh8l_d5fc5d05-07ba-476e-835e-61dfa2e9edc1/neutron-httpd/0.log" Feb 17 15:50:33 crc kubenswrapper[4717]: I0217 15:50:33.846782 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb_e4ab62d3-2bec-418d-ad69-7d384f86652c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:34 crc kubenswrapper[4717]: I0217 15:50:34.366892 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a9a8f201-b55e-47e3-9d85-18d73631f9ae/nova-api-log/0.log" Feb 17 15:50:34 crc kubenswrapper[4717]: I0217 15:50:34.677787 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_18f24db6-956b-479c-88b5-283ae2b17f4d/nova-cell0-conductor-conductor/0.log" Feb 17 15:50:34 crc kubenswrapper[4717]: I0217 15:50:34.730162 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a9a8f201-b55e-47e3-9d85-18d73631f9ae/nova-api-api/0.log" Feb 17 15:50:34 crc kubenswrapper[4717]: I0217 15:50:34.819232 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_731c9760-bb30-4dd0-b246-0fb9ed312ae9/nova-cell1-conductor-conductor/0.log" Feb 17 15:50:35 crc kubenswrapper[4717]: I0217 15:50:35.335585 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-sftg9_377efa15-97db-4618-85fd-1185cefde9a7/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:35 crc kubenswrapper[4717]: I0217 15:50:35.344855 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_90d942b5-ae77-4210-b456-ca573622fc06/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 15:50:35 crc kubenswrapper[4717]: I0217 15:50:35.488962 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f3c5ad1f-898b-4643-80a5-6946068bf842/nova-metadata-log/0.log" Feb 17 15:50:35 crc kubenswrapper[4717]: I0217 15:50:35.702363 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a119d602-10c8-4b7b-aa61-77774c7f024f/nova-scheduler-scheduler/0.log" Feb 17 15:50:35 crc kubenswrapper[4717]: I0217 15:50:35.824963 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_14b7d39d-8183-4e96-a163-b72323ccb0b5/mysql-bootstrap/0.log" Feb 17 15:50:35 crc kubenswrapper[4717]: I0217 15:50:35.941614 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_14b7d39d-8183-4e96-a163-b72323ccb0b5/mysql-bootstrap/0.log" Feb 17 15:50:35 crc kubenswrapper[4717]: I0217 15:50:35.976286 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_14b7d39d-8183-4e96-a163-b72323ccb0b5/galera/0.log" Feb 17 15:50:36 crc kubenswrapper[4717]: I0217 15:50:36.174205 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8ec72b87-16f5-487e-ae08-a52b5d289bee/mysql-bootstrap/0.log" Feb 17 15:50:36 crc kubenswrapper[4717]: I0217 15:50:36.304964 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8ec72b87-16f5-487e-ae08-a52b5d289bee/mysql-bootstrap/0.log" Feb 17 15:50:36 crc kubenswrapper[4717]: I0217 15:50:36.343884 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8ec72b87-16f5-487e-ae08-a52b5d289bee/galera/0.log" Feb 17 15:50:36 crc kubenswrapper[4717]: I0217 15:50:36.468954 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8cf42885-3509-4779-901c-e88f11c5fdfd/openstackclient/0.log" Feb 17 15:50:36 crc kubenswrapper[4717]: I0217 15:50:36.539740 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f3c5ad1f-898b-4643-80a5-6946068bf842/nova-metadata-metadata/0.log" Feb 17 15:50:36 crc kubenswrapper[4717]: I0217 15:50:36.658912 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-k4zc7_edd6eb53-55b7-4a61-867c-e4bf277af963/ovn-controller/0.log" Feb 17 15:50:36 crc kubenswrapper[4717]: I0217 15:50:36.725215 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xcmtm_e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74/openstack-network-exporter/0.log" Feb 17 15:50:36 crc kubenswrapper[4717]: I0217 15:50:36.855036 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5r2q4_1954ed93-1aa6-4c08-8379-01d047f5da20/ovsdb-server-init/0.log" Feb 17 15:50:36 crc kubenswrapper[4717]: I0217 15:50:36.991375 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5r2q4_1954ed93-1aa6-4c08-8379-01d047f5da20/ovsdb-server-init/0.log" Feb 17 15:50:37 crc kubenswrapper[4717]: I0217 15:50:37.058901 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5r2q4_1954ed93-1aa6-4c08-8379-01d047f5da20/ovs-vswitchd/0.log" Feb 17 15:50:37 crc kubenswrapper[4717]: I0217 15:50:37.078583 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5r2q4_1954ed93-1aa6-4c08-8379-01d047f5da20/ovsdb-server/0.log" Feb 17 15:50:37 crc kubenswrapper[4717]: I0217 15:50:37.281970 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-mvpm4_f366c26c-4b32-488e-8738-dbbf0ddd3adc/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:37 crc kubenswrapper[4717]: I0217 15:50:37.290314 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a56b9da3-623b-44df-861c-62c9b45566db/ovn-northd/0.log" Feb 17 15:50:37 crc kubenswrapper[4717]: I0217 15:50:37.310416 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a56b9da3-623b-44df-861c-62c9b45566db/openstack-network-exporter/0.log" Feb 17 15:50:37 crc kubenswrapper[4717]: I0217 15:50:37.454708 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a71caa4-7d53-466e-8d74-98c814d3afda/openstack-network-exporter/0.log" Feb 17 15:50:37 crc kubenswrapper[4717]: I0217 15:50:37.517188 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a71caa4-7d53-466e-8d74-98c814d3afda/ovsdbserver-nb/0.log" Feb 17 15:50:37 crc kubenswrapper[4717]: I0217 15:50:37.633074 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4/openstack-network-exporter/0.log" Feb 17 15:50:37 crc kubenswrapper[4717]: I0217 15:50:37.715355 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4/ovsdbserver-sb/0.log" Feb 17 15:50:37 crc kubenswrapper[4717]: I0217 15:50:37.844542 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7fb4dbdc46-npdpn_c1670724-02ac-4de2-b161-c2b78ecb9bb0/placement-api/0.log" Feb 17 15:50:37 crc kubenswrapper[4717]: I0217 15:50:37.929434 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7fb4dbdc46-npdpn_c1670724-02ac-4de2-b161-c2b78ecb9bb0/placement-log/0.log" Feb 17 15:50:38 crc kubenswrapper[4717]: I0217 15:50:38.045689 
4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b/setup-container/0.log" Feb 17 15:50:38 crc kubenswrapper[4717]: I0217 15:50:38.198746 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b/setup-container/0.log" Feb 17 15:50:38 crc kubenswrapper[4717]: I0217 15:50:38.240303 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b/rabbitmq/0.log" Feb 17 15:50:38 crc kubenswrapper[4717]: I0217 15:50:38.316500 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7/setup-container/0.log" Feb 17 15:50:38 crc kubenswrapper[4717]: I0217 15:50:38.449819 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7/setup-container/0.log" Feb 17 15:50:38 crc kubenswrapper[4717]: I0217 15:50:38.477984 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7/rabbitmq/0.log" Feb 17 15:50:38 crc kubenswrapper[4717]: I0217 15:50:38.554760 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp_dcbb0902-ee67-4df1-b420-f299e4400354/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:38 crc kubenswrapper[4717]: I0217 15:50:38.719632 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-678sx_2047a6dc-1a1c-4b26-bee6-c16d812a99df/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:38 crc kubenswrapper[4717]: I0217 15:50:38.770021 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm_de72738c-0584-4539-9bd9-92382a0f5538/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:38 crc kubenswrapper[4717]: I0217 15:50:38.939757 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-dbw5h_c60befe9-ac76-4ba1-9cd0-15154e3c4e7a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.032960 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-6rtxk_205b31b1-1e73-466f-9ede-0248217b4356/ssh-known-hosts-edpm-deployment/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.241809 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-874f74f55-n92h5_8206d35f-44b8-45f5-9286-8e4179701b96/proxy-server/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.298911 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-874f74f55-n92h5_8206d35f-44b8-45f5-9286-8e4179701b96/proxy-httpd/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.377100 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-rhzj6_929feb4c-fd82-4293-9a44-a6f53816cdae/swift-ring-rebalance/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.540500 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/account-auditor/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.599711 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/account-reaper/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.623721 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/account-replicator/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.741979 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/account-server/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.755107 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/container-auditor/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.878759 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/container-replicator/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.922821 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/container-server/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.938763 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/container-updater/0.log" Feb 17 15:50:39 crc kubenswrapper[4717]: I0217 15:50:39.987675 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/object-auditor/0.log" Feb 17 15:50:40 crc kubenswrapper[4717]: I0217 15:50:40.079489 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/object-expirer/0.log" Feb 17 15:50:40 crc kubenswrapper[4717]: I0217 15:50:40.156361 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/object-replicator/0.log" Feb 17 15:50:40 crc kubenswrapper[4717]: I0217 15:50:40.177338 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/object-server/0.log" Feb 17 15:50:40 crc kubenswrapper[4717]: I0217 15:50:40.267285 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/object-updater/0.log" Feb 17 15:50:40 crc kubenswrapper[4717]: I0217 15:50:40.399129 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/rsync/0.log" Feb 17 15:50:40 crc kubenswrapper[4717]: I0217 15:50:40.442797 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/swift-recon-cron/0.log" Feb 17 15:50:40 crc kubenswrapper[4717]: I0217 15:50:40.597261 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv_92941f8a-938e-41b2-ae92-ea490e7050d9/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:40 crc kubenswrapper[4717]: I0217 15:50:40.672696 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_6bc8b07d-6032-43fa-821d-fa1685427d56/tempest-tests-tempest-tests-runner/0.log" Feb 17 15:50:40 crc kubenswrapper[4717]: I0217 15:50:40.778631 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2f75b776-5a46-4241-adb5-eb50dbc8ba0f/test-operator-logs-container/0.log" Feb 17 15:50:40 crc kubenswrapper[4717]: I0217 15:50:40.913914 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-frrhf_6dfba30c-cf0e-4165-bc37-8284cc15b50f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:50:48 crc kubenswrapper[4717]: I0217 15:50:48.967900 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f/memcached/0.log" Feb 17 15:51:06 crc kubenswrapper[4717]: I0217 15:51:06.194933 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/util/0.log" Feb 17 15:51:06 crc kubenswrapper[4717]: I0217 15:51:06.399725 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/util/0.log" Feb 17 15:51:06 crc kubenswrapper[4717]: I0217 15:51:06.443594 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/pull/0.log" Feb 17 15:51:06 crc kubenswrapper[4717]: I0217 15:51:06.478697 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/pull/0.log" Feb 17 15:51:06 crc kubenswrapper[4717]: I0217 15:51:06.625963 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/extract/0.log" Feb 17 15:51:06 crc kubenswrapper[4717]: I0217 15:51:06.643115 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/pull/0.log" Feb 17 15:51:06 crc kubenswrapper[4717]: I0217 15:51:06.644815 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/util/0.log" Feb 17 15:51:07 crc kubenswrapper[4717]: I0217 15:51:07.020074 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-5ddqf_55e71e0a-9623-4049-b828-77040b5dd36e/manager/0.log" Feb 17 15:51:07 crc kubenswrapper[4717]: I0217 15:51:07.348437 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-qdg4j_0004ea51-4233-47ad-a9d9-8e5d745a55f8/manager/0.log" Feb 17 15:51:07 crc kubenswrapper[4717]: I0217 15:51:07.535746 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-5d768_c12ec16a-c8fd-48ae-8c86-257bcef97050/manager/0.log" Feb 17 15:51:07 crc kubenswrapper[4717]: I0217 15:51:07.747782 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-9kp5d_9bc8e53b-549b-48d9-810f-25ce640b7339/manager/0.log" Feb 17 15:51:08 crc kubenswrapper[4717]: I0217 15:51:08.094011 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-5rhm9_aee98c41-a3b5-43ba-b272-279e5836df0b/manager/0.log" Feb 17 15:51:08 crc kubenswrapper[4717]: I0217 15:51:08.230005 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-lcth2_5390fc8f-fcef-4a64-8d3c-4ba8e9b2f86d/manager/0.log" Feb 17 15:51:08 crc kubenswrapper[4717]: I0217 15:51:08.306110 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-d6ssr_9aa57429-09b8-4262-8356-fd8ea486b236/manager/0.log" Feb 17 15:51:08 crc kubenswrapper[4717]: I0217 15:51:08.506140 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-d7sx6_6edcc364-656f-4f2d-aa9a-3409b3b58471/manager/0.log" Feb 17 15:51:08 crc kubenswrapper[4717]: I0217 15:51:08.574798 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-dtxn5_514fed4f-53b1-4b52-8b25-7e4ec648e155/manager/0.log" Feb 17 15:51:08 crc kubenswrapper[4717]: I0217 15:51:08.798312 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-s6xjq_563dc2c2-73a1-485a-ab9e-6f7e0b3423cb/manager/0.log" Feb 17 15:51:09 crc kubenswrapper[4717]: I0217 15:51:09.064705 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-gmfn2_22accf0b-8a4d-478a-bc45-d5bd4aa45b87/manager/0.log" Feb 17 15:51:09 crc kubenswrapper[4717]: I0217 15:51:09.294438 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-22tfq_3a97433f-8ce2-446e-92ef-170a4996ffe8/manager/0.log" Feb 17 15:51:09 crc kubenswrapper[4717]: I0217 15:51:09.500764 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cm8796_b4295f2b-c9b6-4604-b910-525c07cca2ed/manager/0.log" Feb 17 15:51:09 crc kubenswrapper[4717]: I0217 15:51:09.949739 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7948dfdc59-ljckc_ece1adcd-6e4d-4cf5-afd4-d108db8df6d6/operator/0.log" Feb 17 15:51:10 crc kubenswrapper[4717]: I0217 15:51:10.431505 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jpq9r_e69d57ba-48b1-48d9-b658-c0a86cf05ab4/registry-server/0.log" Feb 17 15:51:10 crc kubenswrapper[4717]: I0217 15:51:10.656387 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-x6nmz_8a4392b6-6232-4135-91f9-676c565446fc/manager/0.log" Feb 17 15:51:10 crc kubenswrapper[4717]: I0217 15:51:10.726845 
4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-mspl4_3dcfb2d1-e765-4eb0-8300-f7567a34cae7/manager/0.log" Feb 17 15:51:10 crc kubenswrapper[4717]: I0217 15:51:10.795761 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-hzzpc_ae48b931-1ac6-43ed-a407-b3fbb3d56178/manager/0.log" Feb 17 15:51:10 crc kubenswrapper[4717]: I0217 15:51:10.945445 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-2w74d_dd3497e6-6de6-4bdf-a23e-16adc21de6ab/operator/0.log" Feb 17 15:51:11 crc kubenswrapper[4717]: I0217 15:51:11.072456 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-j9tsj_2872debe-d42d-4955-bcca-5006aa7a2ecc/manager/0.log" Feb 17 15:51:11 crc kubenswrapper[4717]: I0217 15:51:11.278850 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-spbft_af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb/manager/0.log" Feb 17 15:51:11 crc kubenswrapper[4717]: I0217 15:51:11.370550 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-zlkwq_aaf337fd-63ab-42de-a465-02fecc40116b/manager/0.log" Feb 17 15:51:11 crc kubenswrapper[4717]: I0217 15:51:11.505010 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-jq8zs_5fdc2829-32d7-456a-99f3-e15b957b272e/manager/0.log" Feb 17 15:51:11 crc kubenswrapper[4717]: I0217 15:51:11.846182 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7d87bc949d-9mr4h_f2279bdf-746c-4e8c-8703-74a256bd7923/manager/0.log" Feb 17 15:51:13 crc kubenswrapper[4717]: I0217 
15:51:13.234825 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-fp8nh_cd977519-10c4-4afe-8f51-52f6cab597f9/manager/0.log" Feb 17 15:51:31 crc kubenswrapper[4717]: I0217 15:51:31.479236 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-vxkqh_f576665e-17f5-4704-bd20-5debf9fb8612/control-plane-machine-set-operator/0.log" Feb 17 15:51:31 crc kubenswrapper[4717]: I0217 15:51:31.637871 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9t64v_ba04adcf-e228-4486-a74a-e9846bdaa53f/kube-rbac-proxy/0.log" Feb 17 15:51:31 crc kubenswrapper[4717]: I0217 15:51:31.699581 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9t64v_ba04adcf-e228-4486-a74a-e9846bdaa53f/machine-api-operator/0.log" Feb 17 15:51:45 crc kubenswrapper[4717]: I0217 15:51:45.292469 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-kfbxb_448f5725-7b22-4ceb-8c7f-f1e8883d423b/cert-manager-controller/0.log" Feb 17 15:51:45 crc kubenswrapper[4717]: I0217 15:51:45.507209 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n2cmk_52e5e91c-8239-4ffb-bc9a-a5dc8780f6e3/cert-manager-cainjector/0.log" Feb 17 15:51:45 crc kubenswrapper[4717]: I0217 15:51:45.514296 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-n82c5_04796f6f-6755-4e30-9fec-7003006dd113/cert-manager-webhook/0.log" Feb 17 15:51:50 crc kubenswrapper[4717]: I0217 15:51:50.809148 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:51:50 crc kubenswrapper[4717]: I0217 15:51:50.809998 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:51:52 crc kubenswrapper[4717]: I0217 15:51:52.956617 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zgmlr"] Feb 17 15:51:52 crc kubenswrapper[4717]: E0217 15:51:52.957761 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e21ded8-4590-4374-a8bb-a6097a1636d8" containerName="container-00" Feb 17 15:51:52 crc kubenswrapper[4717]: I0217 15:51:52.957784 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e21ded8-4590-4374-a8bb-a6097a1636d8" containerName="container-00" Feb 17 15:51:52 crc kubenswrapper[4717]: I0217 15:51:52.958196 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e21ded8-4590-4374-a8bb-a6097a1636d8" containerName="container-00" Feb 17 15:51:52 crc kubenswrapper[4717]: I0217 15:51:52.960584 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:51:52 crc kubenswrapper[4717]: I0217 15:51:52.979436 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgmlr"] Feb 17 15:51:53 crc kubenswrapper[4717]: I0217 15:51:53.108810 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-utilities\") pod \"certified-operators-zgmlr\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:51:53 crc kubenswrapper[4717]: I0217 15:51:53.108907 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-catalog-content\") pod \"certified-operators-zgmlr\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:51:53 crc kubenswrapper[4717]: I0217 15:51:53.108985 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb7db\" (UniqueName: \"kubernetes.io/projected/c7740552-837c-4e09-b53c-98ae5304ccd4-kube-api-access-lb7db\") pod \"certified-operators-zgmlr\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:51:53 crc kubenswrapper[4717]: I0217 15:51:53.210543 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-catalog-content\") pod \"certified-operators-zgmlr\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:51:53 crc kubenswrapper[4717]: I0217 15:51:53.210709 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lb7db\" (UniqueName: \"kubernetes.io/projected/c7740552-837c-4e09-b53c-98ae5304ccd4-kube-api-access-lb7db\") pod \"certified-operators-zgmlr\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:51:53 crc kubenswrapper[4717]: I0217 15:51:53.210884 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-utilities\") pod \"certified-operators-zgmlr\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:51:53 crc kubenswrapper[4717]: I0217 15:51:53.211437 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-catalog-content\") pod \"certified-operators-zgmlr\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:51:53 crc kubenswrapper[4717]: I0217 15:51:53.211620 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-utilities\") pod \"certified-operators-zgmlr\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:51:53 crc kubenswrapper[4717]: I0217 15:51:53.232855 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb7db\" (UniqueName: \"kubernetes.io/projected/c7740552-837c-4e09-b53c-98ae5304ccd4-kube-api-access-lb7db\") pod \"certified-operators-zgmlr\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:51:53 crc kubenswrapper[4717]: I0217 15:51:53.293691 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:51:53 crc kubenswrapper[4717]: I0217 15:51:53.830981 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgmlr"] Feb 17 15:51:54 crc kubenswrapper[4717]: I0217 15:51:54.622906 4717 generic.go:334] "Generic (PLEG): container finished" podID="c7740552-837c-4e09-b53c-98ae5304ccd4" containerID="cc265a6ee5b09305974400688d6e90a6d91d9b1d568ef9b362307ec226f14a41" exitCode=0 Feb 17 15:51:54 crc kubenswrapper[4717]: I0217 15:51:54.622985 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgmlr" event={"ID":"c7740552-837c-4e09-b53c-98ae5304ccd4","Type":"ContainerDied","Data":"cc265a6ee5b09305974400688d6e90a6d91d9b1d568ef9b362307ec226f14a41"} Feb 17 15:51:54 crc kubenswrapper[4717]: I0217 15:51:54.623071 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgmlr" event={"ID":"c7740552-837c-4e09-b53c-98ae5304ccd4","Type":"ContainerStarted","Data":"82894901452cc1c060f650e9ab080d4a80d99656fc6d440b1d7f72d02e33bb09"} Feb 17 15:51:55 crc kubenswrapper[4717]: I0217 15:51:55.638447 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgmlr" event={"ID":"c7740552-837c-4e09-b53c-98ae5304ccd4","Type":"ContainerStarted","Data":"c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3"} Feb 17 15:51:56 crc kubenswrapper[4717]: I0217 15:51:56.651779 4717 generic.go:334] "Generic (PLEG): container finished" podID="c7740552-837c-4e09-b53c-98ae5304ccd4" containerID="c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3" exitCode=0 Feb 17 15:51:56 crc kubenswrapper[4717]: I0217 15:51:56.651863 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgmlr" 
event={"ID":"c7740552-837c-4e09-b53c-98ae5304ccd4","Type":"ContainerDied","Data":"c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3"} Feb 17 15:51:57 crc kubenswrapper[4717]: I0217 15:51:57.668829 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgmlr" event={"ID":"c7740552-837c-4e09-b53c-98ae5304ccd4","Type":"ContainerStarted","Data":"af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888"} Feb 17 15:51:57 crc kubenswrapper[4717]: I0217 15:51:57.707891 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zgmlr" podStartSLOduration=3.303116219 podStartE2EDuration="5.707864245s" podCreationTimestamp="2026-02-17 15:51:52 +0000 UTC" firstStartedPulling="2026-02-17 15:51:54.627255269 +0000 UTC m=+3581.043095745" lastFinishedPulling="2026-02-17 15:51:57.032003295 +0000 UTC m=+3583.447843771" observedRunningTime="2026-02-17 15:51:57.693224343 +0000 UTC m=+3584.109064859" watchObservedRunningTime="2026-02-17 15:51:57.707864245 +0000 UTC m=+3584.123704751" Feb 17 15:52:00 crc kubenswrapper[4717]: I0217 15:52:00.629538 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-7rlqn_303af50d-ffd5-4a3b-9020-b96bdcddc135/nmstate-console-plugin/0.log" Feb 17 15:52:00 crc kubenswrapper[4717]: I0217 15:52:00.839860 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-r6698_28ecae2c-05f9-43e9-ad1a-581b4d6e8fea/kube-rbac-proxy/0.log" Feb 17 15:52:00 crc kubenswrapper[4717]: I0217 15:52:00.841153 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-kcx4j_6c630162-18cc-4f15-80f4-2b1ad7a2e87c/nmstate-handler/0.log" Feb 17 15:52:00 crc kubenswrapper[4717]: I0217 15:52:00.849942 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-r6698_28ecae2c-05f9-43e9-ad1a-581b4d6e8fea/nmstate-metrics/0.log" Feb 17 15:52:01 crc kubenswrapper[4717]: I0217 15:52:01.004970 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-xgp5x_05310076-5092-4692-a1a2-69826306ea88/nmstate-operator/0.log" Feb 17 15:52:01 crc kubenswrapper[4717]: I0217 15:52:01.082029 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-pxrw7_005457ad-adeb-49ab-aaf7-6043dc2b6021/nmstate-webhook/0.log" Feb 17 15:52:03 crc kubenswrapper[4717]: I0217 15:52:03.296219 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:52:03 crc kubenswrapper[4717]: I0217 15:52:03.296619 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:52:03 crc kubenswrapper[4717]: I0217 15:52:03.355407 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:52:03 crc kubenswrapper[4717]: I0217 15:52:03.802997 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:52:03 crc kubenswrapper[4717]: I0217 15:52:03.873522 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zgmlr"] Feb 17 15:52:05 crc kubenswrapper[4717]: I0217 15:52:05.751987 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zgmlr" podUID="c7740552-837c-4e09-b53c-98ae5304ccd4" containerName="registry-server" containerID="cri-o://af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888" gracePeriod=2 Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.281239 4717 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.384097 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-utilities\") pod \"c7740552-837c-4e09-b53c-98ae5304ccd4\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.384152 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-catalog-content\") pod \"c7740552-837c-4e09-b53c-98ae5304ccd4\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.384401 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb7db\" (UniqueName: \"kubernetes.io/projected/c7740552-837c-4e09-b53c-98ae5304ccd4-kube-api-access-lb7db\") pod \"c7740552-837c-4e09-b53c-98ae5304ccd4\" (UID: \"c7740552-837c-4e09-b53c-98ae5304ccd4\") " Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.388938 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-utilities" (OuterVolumeSpecName: "utilities") pod "c7740552-837c-4e09-b53c-98ae5304ccd4" (UID: "c7740552-837c-4e09-b53c-98ae5304ccd4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.394700 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7740552-837c-4e09-b53c-98ae5304ccd4-kube-api-access-lb7db" (OuterVolumeSpecName: "kube-api-access-lb7db") pod "c7740552-837c-4e09-b53c-98ae5304ccd4" (UID: "c7740552-837c-4e09-b53c-98ae5304ccd4"). 
InnerVolumeSpecName "kube-api-access-lb7db". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.439742 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7740552-837c-4e09-b53c-98ae5304ccd4" (UID: "c7740552-837c-4e09-b53c-98ae5304ccd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.490422 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.490462 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7740552-837c-4e09-b53c-98ae5304ccd4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.490504 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb7db\" (UniqueName: \"kubernetes.io/projected/c7740552-837c-4e09-b53c-98ae5304ccd4-kube-api-access-lb7db\") on node \"crc\" DevicePath \"\"" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.765204 4717 generic.go:334] "Generic (PLEG): container finished" podID="c7740552-837c-4e09-b53c-98ae5304ccd4" containerID="af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888" exitCode=0 Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.765271 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgmlr" event={"ID":"c7740552-837c-4e09-b53c-98ae5304ccd4","Type":"ContainerDied","Data":"af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888"} Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.765324 
4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgmlr" event={"ID":"c7740552-837c-4e09-b53c-98ae5304ccd4","Type":"ContainerDied","Data":"82894901452cc1c060f650e9ab080d4a80d99656fc6d440b1d7f72d02e33bb09"} Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.765343 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zgmlr" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.765350 4717 scope.go:117] "RemoveContainer" containerID="af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.803560 4717 scope.go:117] "RemoveContainer" containerID="c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.812688 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zgmlr"] Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.835619 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zgmlr"] Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.842260 4717 scope.go:117] "RemoveContainer" containerID="cc265a6ee5b09305974400688d6e90a6d91d9b1d568ef9b362307ec226f14a41" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.899032 4717 scope.go:117] "RemoveContainer" containerID="af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888" Feb 17 15:52:06 crc kubenswrapper[4717]: E0217 15:52:06.901041 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888\": container with ID starting with af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888 not found: ID does not exist" containerID="af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888" Feb 17 15:52:06 
crc kubenswrapper[4717]: I0217 15:52:06.901104 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888"} err="failed to get container status \"af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888\": rpc error: code = NotFound desc = could not find container \"af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888\": container with ID starting with af3742387f0b2a69c73b1332c1bba499b3929c05dc9e13ecbe8fd10d18aa5888 not found: ID does not exist" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.901131 4717 scope.go:117] "RemoveContainer" containerID="c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3" Feb 17 15:52:06 crc kubenswrapper[4717]: E0217 15:52:06.901428 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3\": container with ID starting with c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3 not found: ID does not exist" containerID="c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.901453 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3"} err="failed to get container status \"c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3\": rpc error: code = NotFound desc = could not find container \"c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3\": container with ID starting with c92f3541375150117c48f15f53c5d14b13a03ec815c34de5828a68c42a6982c3 not found: ID does not exist" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.901470 4717 scope.go:117] "RemoveContainer" containerID="cc265a6ee5b09305974400688d6e90a6d91d9b1d568ef9b362307ec226f14a41" Feb 17 
15:52:06 crc kubenswrapper[4717]: E0217 15:52:06.902812 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc265a6ee5b09305974400688d6e90a6d91d9b1d568ef9b362307ec226f14a41\": container with ID starting with cc265a6ee5b09305974400688d6e90a6d91d9b1d568ef9b362307ec226f14a41 not found: ID does not exist" containerID="cc265a6ee5b09305974400688d6e90a6d91d9b1d568ef9b362307ec226f14a41" Feb 17 15:52:06 crc kubenswrapper[4717]: I0217 15:52:06.902838 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc265a6ee5b09305974400688d6e90a6d91d9b1d568ef9b362307ec226f14a41"} err="failed to get container status \"cc265a6ee5b09305974400688d6e90a6d91d9b1d568ef9b362307ec226f14a41\": rpc error: code = NotFound desc = could not find container \"cc265a6ee5b09305974400688d6e90a6d91d9b1d568ef9b362307ec226f14a41\": container with ID starting with cc265a6ee5b09305974400688d6e90a6d91d9b1d568ef9b362307ec226f14a41 not found: ID does not exist" Feb 17 15:52:07 crc kubenswrapper[4717]: I0217 15:52:07.858367 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7740552-837c-4e09-b53c-98ae5304ccd4" path="/var/lib/kubelet/pods/c7740552-837c-4e09-b53c-98ae5304ccd4/volumes" Feb 17 15:52:20 crc kubenswrapper[4717]: I0217 15:52:20.808339 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:52:20 crc kubenswrapper[4717]: I0217 15:52:20.809040 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.025424 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7k7sc_72e0c267-99f7-4d36-83e8-219560a63667/controller/0.log" Feb 17 15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.134855 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7k7sc_72e0c267-99f7-4d36-83e8-219560a63667/kube-rbac-proxy/0.log" Feb 17 15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.307304 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-frr-files/0.log" Feb 17 15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.444074 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-metrics/0.log" Feb 17 15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.458512 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-frr-files/0.log" Feb 17 15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.496537 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-reloader/0.log" Feb 17 15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.498918 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-reloader/0.log" Feb 17 15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.716481 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-metrics/0.log" Feb 17 15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.736220 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-reloader/0.log" Feb 17 
15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.766981 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-frr-files/0.log" Feb 17 15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.802799 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-metrics/0.log" Feb 17 15:52:30 crc kubenswrapper[4717]: I0217 15:52:30.967928 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-reloader/0.log" Feb 17 15:52:31 crc kubenswrapper[4717]: I0217 15:52:31.000655 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-metrics/0.log" Feb 17 15:52:31 crc kubenswrapper[4717]: I0217 15:52:31.005801 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/controller/0.log" Feb 17 15:52:31 crc kubenswrapper[4717]: I0217 15:52:31.043212 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-frr-files/0.log" Feb 17 15:52:31 crc kubenswrapper[4717]: I0217 15:52:31.154848 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/frr-metrics/0.log" Feb 17 15:52:31 crc kubenswrapper[4717]: I0217 15:52:31.218114 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/kube-rbac-proxy/0.log" Feb 17 15:52:31 crc kubenswrapper[4717]: I0217 15:52:31.262184 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/kube-rbac-proxy-frr/0.log" Feb 17 15:52:31 crc kubenswrapper[4717]: I0217 15:52:31.395480 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/reloader/0.log" Feb 17 15:52:31 crc kubenswrapper[4717]: I0217 15:52:31.478406 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-8f7bl_acf9f85d-e0ff-419b-8b56-ceda8ffeb28a/frr-k8s-webhook-server/0.log" Feb 17 15:52:31 crc kubenswrapper[4717]: I0217 15:52:31.728254 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-774c978687-jj472_656da445-df5b-402b-8347-70aa45a92159/manager/0.log" Feb 17 15:52:31 crc kubenswrapper[4717]: I0217 15:52:31.817495 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56fd4fb65-pddnn_5b9d38f5-133a-43ed-bfef-8e5a27fa200c/webhook-server/0.log" Feb 17 15:52:31 crc kubenswrapper[4717]: I0217 15:52:31.977989 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vc7q2_0179ef66-9a6c-440d-8606-f2f040fe7b44/kube-rbac-proxy/0.log" Feb 17 15:52:32 crc kubenswrapper[4717]: I0217 15:52:32.434396 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vc7q2_0179ef66-9a6c-440d-8606-f2f040fe7b44/speaker/0.log" Feb 17 15:52:32 crc kubenswrapper[4717]: I0217 15:52:32.549715 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/frr/0.log" Feb 17 15:52:46 crc kubenswrapper[4717]: I0217 15:52:46.436175 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/util/0.log" Feb 17 15:52:46 crc kubenswrapper[4717]: I0217 15:52:46.664794 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/util/0.log" Feb 17 15:52:46 crc kubenswrapper[4717]: I0217 15:52:46.680474 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/pull/0.log" Feb 17 15:52:46 crc kubenswrapper[4717]: I0217 15:52:46.722694 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/pull/0.log" Feb 17 15:52:46 crc kubenswrapper[4717]: I0217 15:52:46.962453 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/util/0.log" Feb 17 15:52:47 crc kubenswrapper[4717]: I0217 15:52:47.046133 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/extract/0.log" Feb 17 15:52:47 crc kubenswrapper[4717]: I0217 15:52:47.072198 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/pull/0.log" Feb 17 15:52:47 crc kubenswrapper[4717]: I0217 15:52:47.176560 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-utilities/0.log" Feb 17 15:52:47 crc kubenswrapper[4717]: I0217 15:52:47.331349 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-content/0.log" Feb 17 15:52:47 crc kubenswrapper[4717]: I0217 
15:52:47.346094 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-utilities/0.log" Feb 17 15:52:47 crc kubenswrapper[4717]: I0217 15:52:47.356338 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-content/0.log" Feb 17 15:52:47 crc kubenswrapper[4717]: I0217 15:52:47.535226 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-content/0.log" Feb 17 15:52:47 crc kubenswrapper[4717]: I0217 15:52:47.579013 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-utilities/0.log" Feb 17 15:52:47 crc kubenswrapper[4717]: I0217 15:52:47.796414 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-utilities/0.log" Feb 17 15:52:47 crc kubenswrapper[4717]: I0217 15:52:47.949964 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-content/0.log" Feb 17 15:52:47 crc kubenswrapper[4717]: I0217 15:52:47.992967 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-utilities/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.042647 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-content/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.189356 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/registry-server/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.225686 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-content/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.237039 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-utilities/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.431440 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/util/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.634694 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/util/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.653052 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/pull/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.694478 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/pull/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.784858 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/registry-server/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.869042 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/util/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.885485 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/pull/0.log" Feb 17 15:52:48 crc kubenswrapper[4717]: I0217 15:52:48.898156 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/extract/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.058919 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-njsvg_813c7436-6b2f-45ed-8fc8-d400f00c80fd/marketplace-operator/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.086655 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-utilities/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.274526 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-utilities/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.307830 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-content/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.311382 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-content/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.468562 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-content/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.480564 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-utilities/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.595112 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/registry-server/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.632105 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-utilities/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.841877 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-utilities/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.850122 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-content/0.log" Feb 17 15:52:49 crc kubenswrapper[4717]: I0217 15:52:49.868106 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-content/0.log" Feb 17 15:52:50 crc kubenswrapper[4717]: I0217 15:52:50.021999 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-utilities/0.log" Feb 17 15:52:50 crc kubenswrapper[4717]: I0217 15:52:50.037155 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-content/0.log" Feb 
17 15:52:50 crc kubenswrapper[4717]: I0217 15:52:50.193847 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/registry-server/0.log" Feb 17 15:52:50 crc kubenswrapper[4717]: I0217 15:52:50.808007 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:52:50 crc kubenswrapper[4717]: I0217 15:52:50.808123 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:52:50 crc kubenswrapper[4717]: I0217 15:52:50.808190 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:52:50 crc kubenswrapper[4717]: I0217 15:52:50.809196 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5358db44d6f61af277bafa36114e49aed7d2fde0271825a0b49956b39c5a8b5a"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:52:50 crc kubenswrapper[4717]: I0217 15:52:50.809367 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://5358db44d6f61af277bafa36114e49aed7d2fde0271825a0b49956b39c5a8b5a" 
gracePeriod=600 Feb 17 15:52:51 crc kubenswrapper[4717]: I0217 15:52:51.213605 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="5358db44d6f61af277bafa36114e49aed7d2fde0271825a0b49956b39c5a8b5a" exitCode=0 Feb 17 15:52:51 crc kubenswrapper[4717]: I0217 15:52:51.213641 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"5358db44d6f61af277bafa36114e49aed7d2fde0271825a0b49956b39c5a8b5a"} Feb 17 15:52:51 crc kubenswrapper[4717]: I0217 15:52:51.214021 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9"} Feb 17 15:52:51 crc kubenswrapper[4717]: I0217 15:52:51.214046 4717 scope.go:117] "RemoveContainer" containerID="7f6fdc56a910ff4b08c2a858dfe3da3df945fd40336bef5032117ea81b440bbd" Feb 17 15:54:35 crc kubenswrapper[4717]: I0217 15:54:35.359846 4717 generic.go:334] "Generic (PLEG): container finished" podID="76d22ade-8acb-4a71-a950-880de86ec228" containerID="f39a570357654eb2ef5125f5c68fb0c322352e63b91ba71edb7b6d258ea0cbb0" exitCode=0 Feb 17 15:54:35 crc kubenswrapper[4717]: I0217 15:54:35.359958 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dwwm7/must-gather-ftttb" event={"ID":"76d22ade-8acb-4a71-a950-880de86ec228","Type":"ContainerDied","Data":"f39a570357654eb2ef5125f5c68fb0c322352e63b91ba71edb7b6d258ea0cbb0"} Feb 17 15:54:35 crc kubenswrapper[4717]: I0217 15:54:35.362300 4717 scope.go:117] "RemoveContainer" containerID="f39a570357654eb2ef5125f5c68fb0c322352e63b91ba71edb7b6d258ea0cbb0" Feb 17 15:54:35 crc kubenswrapper[4717]: I0217 15:54:35.785419 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-dwwm7_must-gather-ftttb_76d22ade-8acb-4a71-a950-880de86ec228/gather/0.log" Feb 17 15:54:43 crc kubenswrapper[4717]: I0217 15:54:43.944887 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dwwm7/must-gather-ftttb"] Feb 17 15:54:43 crc kubenswrapper[4717]: I0217 15:54:43.945580 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-dwwm7/must-gather-ftttb" podUID="76d22ade-8acb-4a71-a950-880de86ec228" containerName="copy" containerID="cri-o://af11359abd1d087183410a62eef3e53589a6f7c8e6330eeeef37487156ec6c22" gracePeriod=2 Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.003301 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dwwm7/must-gather-ftttb"] Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.439465 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dwwm7_must-gather-ftttb_76d22ade-8acb-4a71-a950-880de86ec228/copy/0.log" Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.440361 4717 generic.go:334] "Generic (PLEG): container finished" podID="76d22ade-8acb-4a71-a950-880de86ec228" containerID="af11359abd1d087183410a62eef3e53589a6f7c8e6330eeeef37487156ec6c22" exitCode=143 Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.440416 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d36f6d449c2f582e070d088c9bb421684266174f1c27c4fc6e4dee1431d77600" Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.459152 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dwwm7_must-gather-ftttb_76d22ade-8acb-4a71-a950-880de86ec228/copy/0.log" Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.459843 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dwwm7/must-gather-ftttb" Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.539040 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76d22ade-8acb-4a71-a950-880de86ec228-must-gather-output\") pod \"76d22ade-8acb-4a71-a950-880de86ec228\" (UID: \"76d22ade-8acb-4a71-a950-880de86ec228\") " Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.539309 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlxns\" (UniqueName: \"kubernetes.io/projected/76d22ade-8acb-4a71-a950-880de86ec228-kube-api-access-rlxns\") pod \"76d22ade-8acb-4a71-a950-880de86ec228\" (UID: \"76d22ade-8acb-4a71-a950-880de86ec228\") " Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.549311 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d22ade-8acb-4a71-a950-880de86ec228-kube-api-access-rlxns" (OuterVolumeSpecName: "kube-api-access-rlxns") pod "76d22ade-8acb-4a71-a950-880de86ec228" (UID: "76d22ade-8acb-4a71-a950-880de86ec228"). InnerVolumeSpecName "kube-api-access-rlxns". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.641792 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlxns\" (UniqueName: \"kubernetes.io/projected/76d22ade-8acb-4a71-a950-880de86ec228-kube-api-access-rlxns\") on node \"crc\" DevicePath \"\"" Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.737621 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d22ade-8acb-4a71-a950-880de86ec228-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "76d22ade-8acb-4a71-a950-880de86ec228" (UID: "76d22ade-8acb-4a71-a950-880de86ec228"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:54:44 crc kubenswrapper[4717]: I0217 15:54:44.745872 4717 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76d22ade-8acb-4a71-a950-880de86ec228-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 17 15:54:45 crc kubenswrapper[4717]: I0217 15:54:45.452596 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dwwm7/must-gather-ftttb" Feb 17 15:54:45 crc kubenswrapper[4717]: I0217 15:54:45.861354 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d22ade-8acb-4a71-a950-880de86ec228" path="/var/lib/kubelet/pods/76d22ade-8acb-4a71-a950-880de86ec228/volumes" Feb 17 15:55:12 crc kubenswrapper[4717]: I0217 15:55:12.976756 4717 scope.go:117] "RemoveContainer" containerID="f39a570357654eb2ef5125f5c68fb0c322352e63b91ba71edb7b6d258ea0cbb0" Feb 17 15:55:13 crc kubenswrapper[4717]: I0217 15:55:13.073227 4717 scope.go:117] "RemoveContainer" containerID="af11359abd1d087183410a62eef3e53589a6f7c8e6330eeeef37487156ec6c22" Feb 17 15:55:20 crc kubenswrapper[4717]: I0217 15:55:20.808808 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:55:20 crc kubenswrapper[4717]: I0217 15:55:20.809542 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:55:50 crc kubenswrapper[4717]: I0217 15:55:50.808452 4717 patch_prober.go:28] interesting 
pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:55:50 crc kubenswrapper[4717]: I0217 15:55:50.809210 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:56:20 crc kubenswrapper[4717]: I0217 15:56:20.809217 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:56:20 crc kubenswrapper[4717]: I0217 15:56:20.809847 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:56:20 crc kubenswrapper[4717]: I0217 15:56:20.809915 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 15:56:20 crc kubenswrapper[4717]: I0217 15:56:20.810979 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Feb 17 15:56:20 crc kubenswrapper[4717]: I0217 15:56:20.811082 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" gracePeriod=600 Feb 17 15:56:20 crc kubenswrapper[4717]: E0217 15:56:20.944802 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:56:21 crc kubenswrapper[4717]: I0217 15:56:21.525262 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" exitCode=0 Feb 17 15:56:21 crc kubenswrapper[4717]: I0217 15:56:21.525342 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9"} Feb 17 15:56:21 crc kubenswrapper[4717]: I0217 15:56:21.525724 4717 scope.go:117] "RemoveContainer" containerID="5358db44d6f61af277bafa36114e49aed7d2fde0271825a0b49956b39c5a8b5a" Feb 17 15:56:21 crc kubenswrapper[4717]: I0217 15:56:21.526365 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:56:21 crc kubenswrapper[4717]: E0217 15:56:21.526681 4717 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:56:36 crc kubenswrapper[4717]: I0217 15:56:36.847320 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:56:36 crc kubenswrapper[4717]: E0217 15:56:36.848590 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:56:51 crc kubenswrapper[4717]: I0217 15:56:51.846919 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:56:51 crc kubenswrapper[4717]: E0217 15:56:51.847886 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:57:06 crc kubenswrapper[4717]: I0217 15:57:06.111327 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:57:06 crc kubenswrapper[4717]: E0217 15:57:06.113717 4717 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:57:19 crc kubenswrapper[4717]: I0217 15:57:19.846798 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:57:19 crc kubenswrapper[4717]: E0217 15:57:19.847563 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:57:34 crc kubenswrapper[4717]: I0217 15:57:34.847617 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:57:34 crc kubenswrapper[4717]: E0217 15:57:34.848687 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.365811 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hs9kl/must-gather-88x6k"] Feb 17 15:57:46 crc kubenswrapper[4717]: E0217 15:57:46.366680 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c7740552-837c-4e09-b53c-98ae5304ccd4" containerName="extract-content" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.366693 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7740552-837c-4e09-b53c-98ae5304ccd4" containerName="extract-content" Feb 17 15:57:46 crc kubenswrapper[4717]: E0217 15:57:46.366722 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d22ade-8acb-4a71-a950-880de86ec228" containerName="gather" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.366728 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d22ade-8acb-4a71-a950-880de86ec228" containerName="gather" Feb 17 15:57:46 crc kubenswrapper[4717]: E0217 15:57:46.366750 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d22ade-8acb-4a71-a950-880de86ec228" containerName="copy" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.366757 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d22ade-8acb-4a71-a950-880de86ec228" containerName="copy" Feb 17 15:57:46 crc kubenswrapper[4717]: E0217 15:57:46.366766 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7740552-837c-4e09-b53c-98ae5304ccd4" containerName="extract-utilities" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.366773 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7740552-837c-4e09-b53c-98ae5304ccd4" containerName="extract-utilities" Feb 17 15:57:46 crc kubenswrapper[4717]: E0217 15:57:46.366783 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7740552-837c-4e09-b53c-98ae5304ccd4" containerName="registry-server" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.366789 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7740552-837c-4e09-b53c-98ae5304ccd4" containerName="registry-server" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.366933 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="76d22ade-8acb-4a71-a950-880de86ec228" containerName="copy" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.366954 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d22ade-8acb-4a71-a950-880de86ec228" containerName="gather" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.366970 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7740552-837c-4e09-b53c-98ae5304ccd4" containerName="registry-server" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.367900 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hs9kl/must-gather-88x6k" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.370347 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hs9kl"/"openshift-service-ca.crt" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.370525 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hs9kl"/"kube-root-ca.crt" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.371469 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hs9kl"/"default-dockercfg-4m226" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.374198 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hs9kl/must-gather-88x6k"] Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.424622 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/80cd448a-eadd-444f-a811-03e863a95efb-must-gather-output\") pod \"must-gather-88x6k\" (UID: \"80cd448a-eadd-444f-a811-03e863a95efb\") " pod="openshift-must-gather-hs9kl/must-gather-88x6k" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.424934 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9g2q\" (UniqueName: 
\"kubernetes.io/projected/80cd448a-eadd-444f-a811-03e863a95efb-kube-api-access-k9g2q\") pod \"must-gather-88x6k\" (UID: \"80cd448a-eadd-444f-a811-03e863a95efb\") " pod="openshift-must-gather-hs9kl/must-gather-88x6k" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.526362 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9g2q\" (UniqueName: \"kubernetes.io/projected/80cd448a-eadd-444f-a811-03e863a95efb-kube-api-access-k9g2q\") pod \"must-gather-88x6k\" (UID: \"80cd448a-eadd-444f-a811-03e863a95efb\") " pod="openshift-must-gather-hs9kl/must-gather-88x6k" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.526514 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/80cd448a-eadd-444f-a811-03e863a95efb-must-gather-output\") pod \"must-gather-88x6k\" (UID: \"80cd448a-eadd-444f-a811-03e863a95efb\") " pod="openshift-must-gather-hs9kl/must-gather-88x6k" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.526947 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/80cd448a-eadd-444f-a811-03e863a95efb-must-gather-output\") pod \"must-gather-88x6k\" (UID: \"80cd448a-eadd-444f-a811-03e863a95efb\") " pod="openshift-must-gather-hs9kl/must-gather-88x6k" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.554674 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9g2q\" (UniqueName: \"kubernetes.io/projected/80cd448a-eadd-444f-a811-03e863a95efb-kube-api-access-k9g2q\") pod \"must-gather-88x6k\" (UID: \"80cd448a-eadd-444f-a811-03e863a95efb\") " pod="openshift-must-gather-hs9kl/must-gather-88x6k" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.689735 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hs9kl/must-gather-88x6k" Feb 17 15:57:46 crc kubenswrapper[4717]: I0217 15:57:46.847536 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:57:46 crc kubenswrapper[4717]: E0217 15:57:46.848397 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:57:47 crc kubenswrapper[4717]: I0217 15:57:47.188634 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hs9kl/must-gather-88x6k"] Feb 17 15:57:47 crc kubenswrapper[4717]: I0217 15:57:47.570330 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hs9kl/must-gather-88x6k" event={"ID":"80cd448a-eadd-444f-a811-03e863a95efb","Type":"ContainerStarted","Data":"7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424"} Feb 17 15:57:47 crc kubenswrapper[4717]: I0217 15:57:47.570617 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hs9kl/must-gather-88x6k" event={"ID":"80cd448a-eadd-444f-a811-03e863a95efb","Type":"ContainerStarted","Data":"938b461d3937925f28ae13049792ac7e685433207cc8329610d41aebf2e42b9e"} Feb 17 15:57:49 crc kubenswrapper[4717]: I0217 15:57:49.607899 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hs9kl/must-gather-88x6k" event={"ID":"80cd448a-eadd-444f-a811-03e863a95efb","Type":"ContainerStarted","Data":"63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23"} Feb 17 15:57:49 crc kubenswrapper[4717]: I0217 15:57:49.635152 4717 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-hs9kl/must-gather-88x6k" podStartSLOduration=3.635124956 podStartE2EDuration="3.635124956s" podCreationTimestamp="2026-02-17 15:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:57:49.6327811 +0000 UTC m=+3936.048621626" watchObservedRunningTime="2026-02-17 15:57:49.635124956 +0000 UTC m=+3936.050965462" Feb 17 15:57:52 crc kubenswrapper[4717]: I0217 15:57:52.358480 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hs9kl/crc-debug-d4hh4"] Feb 17 15:57:52 crc kubenswrapper[4717]: I0217 15:57:52.359956 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" Feb 17 15:57:52 crc kubenswrapper[4717]: I0217 15:57:52.420812 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27l8h\" (UniqueName: \"kubernetes.io/projected/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-kube-api-access-27l8h\") pod \"crc-debug-d4hh4\" (UID: \"a9562ac3-88f2-46d4-9fc1-2abf553fed5c\") " pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" Feb 17 15:57:52 crc kubenswrapper[4717]: I0217 15:57:52.420956 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-host\") pod \"crc-debug-d4hh4\" (UID: \"a9562ac3-88f2-46d4-9fc1-2abf553fed5c\") " pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" Feb 17 15:57:52 crc kubenswrapper[4717]: I0217 15:57:52.522653 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27l8h\" (UniqueName: \"kubernetes.io/projected/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-kube-api-access-27l8h\") pod \"crc-debug-d4hh4\" (UID: \"a9562ac3-88f2-46d4-9fc1-2abf553fed5c\") " 
pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" Feb 17 15:57:52 crc kubenswrapper[4717]: I0217 15:57:52.522796 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-host\") pod \"crc-debug-d4hh4\" (UID: \"a9562ac3-88f2-46d4-9fc1-2abf553fed5c\") " pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" Feb 17 15:57:52 crc kubenswrapper[4717]: I0217 15:57:52.522965 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-host\") pod \"crc-debug-d4hh4\" (UID: \"a9562ac3-88f2-46d4-9fc1-2abf553fed5c\") " pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" Feb 17 15:57:52 crc kubenswrapper[4717]: I0217 15:57:52.542798 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27l8h\" (UniqueName: \"kubernetes.io/projected/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-kube-api-access-27l8h\") pod \"crc-debug-d4hh4\" (UID: \"a9562ac3-88f2-46d4-9fc1-2abf553fed5c\") " pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" Feb 17 15:57:52 crc kubenswrapper[4717]: I0217 15:57:52.684512 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" Feb 17 15:57:52 crc kubenswrapper[4717]: W0217 15:57:52.715593 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9562ac3_88f2_46d4_9fc1_2abf553fed5c.slice/crio-92d63683fce3b550cb6949d826079661e16a569c140f2c3ed2d54726479006b1 WatchSource:0}: Error finding container 92d63683fce3b550cb6949d826079661e16a569c140f2c3ed2d54726479006b1: Status 404 returned error can't find the container with id 92d63683fce3b550cb6949d826079661e16a569c140f2c3ed2d54726479006b1 Feb 17 15:57:53 crc kubenswrapper[4717]: I0217 15:57:53.644644 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" event={"ID":"a9562ac3-88f2-46d4-9fc1-2abf553fed5c","Type":"ContainerStarted","Data":"11447aa9e8165114db83dbefa3055034e7d022d98cbc9f8c7e417d5364dbc577"} Feb 17 15:57:53 crc kubenswrapper[4717]: I0217 15:57:53.646135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" event={"ID":"a9562ac3-88f2-46d4-9fc1-2abf553fed5c","Type":"ContainerStarted","Data":"92d63683fce3b550cb6949d826079661e16a569c140f2c3ed2d54726479006b1"} Feb 17 15:58:00 crc kubenswrapper[4717]: I0217 15:58:00.846543 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:58:00 crc kubenswrapper[4717]: E0217 15:58:00.847330 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:58:13 crc kubenswrapper[4717]: I0217 15:58:13.853121 4717 
scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:58:13 crc kubenswrapper[4717]: E0217 15:58:13.853946 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:58:26 crc kubenswrapper[4717]: I0217 15:58:26.924370 4717 generic.go:334] "Generic (PLEG): container finished" podID="a9562ac3-88f2-46d4-9fc1-2abf553fed5c" containerID="11447aa9e8165114db83dbefa3055034e7d022d98cbc9f8c7e417d5364dbc577" exitCode=0 Feb 17 15:58:26 crc kubenswrapper[4717]: I0217 15:58:26.924496 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" event={"ID":"a9562ac3-88f2-46d4-9fc1-2abf553fed5c","Type":"ContainerDied","Data":"11447aa9e8165114db83dbefa3055034e7d022d98cbc9f8c7e417d5364dbc577"} Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 15:58:28.062602 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 15:58:28.103071 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hs9kl/crc-debug-d4hh4"] Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 15:58:28.116408 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hs9kl/crc-debug-d4hh4"] Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 15:58:28.188761 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27l8h\" (UniqueName: \"kubernetes.io/projected/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-kube-api-access-27l8h\") pod \"a9562ac3-88f2-46d4-9fc1-2abf553fed5c\" (UID: \"a9562ac3-88f2-46d4-9fc1-2abf553fed5c\") " Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 15:58:28.188980 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-host\") pod \"a9562ac3-88f2-46d4-9fc1-2abf553fed5c\" (UID: \"a9562ac3-88f2-46d4-9fc1-2abf553fed5c\") " Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 15:58:28.189156 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-host" (OuterVolumeSpecName: "host") pod "a9562ac3-88f2-46d4-9fc1-2abf553fed5c" (UID: "a9562ac3-88f2-46d4-9fc1-2abf553fed5c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 15:58:28.189788 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-host\") on node \"crc\" DevicePath \"\"" Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 15:58:28.194051 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-kube-api-access-27l8h" (OuterVolumeSpecName: "kube-api-access-27l8h") pod "a9562ac3-88f2-46d4-9fc1-2abf553fed5c" (UID: "a9562ac3-88f2-46d4-9fc1-2abf553fed5c"). InnerVolumeSpecName "kube-api-access-27l8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 15:58:28.291927 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27l8h\" (UniqueName: \"kubernetes.io/projected/a9562ac3-88f2-46d4-9fc1-2abf553fed5c-kube-api-access-27l8h\") on node \"crc\" DevicePath \"\"" Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 15:58:28.847251 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:58:28 crc kubenswrapper[4717]: E0217 15:58:28.847517 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 15:58:28.945661 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92d63683fce3b550cb6949d826079661e16a569c140f2c3ed2d54726479006b1" Feb 17 15:58:28 crc kubenswrapper[4717]: I0217 
15:58:28.945699 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-d4hh4" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.380888 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hs9kl/crc-debug-l9v5d"] Feb 17 15:58:29 crc kubenswrapper[4717]: E0217 15:58:29.381649 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9562ac3-88f2-46d4-9fc1-2abf553fed5c" containerName="container-00" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.381665 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9562ac3-88f2-46d4-9fc1-2abf553fed5c" containerName="container-00" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.381924 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9562ac3-88f2-46d4-9fc1-2abf553fed5c" containerName="container-00" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.382650 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.512223 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-host\") pod \"crc-debug-l9v5d\" (UID: \"b7bc39db-bf2e-4fca-939f-3c508bca4e8a\") " pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.512488 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wftz4\" (UniqueName: \"kubernetes.io/projected/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-kube-api-access-wftz4\") pod \"crc-debug-l9v5d\" (UID: \"b7bc39db-bf2e-4fca-939f-3c508bca4e8a\") " pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.614413 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-host\") pod \"crc-debug-l9v5d\" (UID: \"b7bc39db-bf2e-4fca-939f-3c508bca4e8a\") " pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.614730 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wftz4\" (UniqueName: \"kubernetes.io/projected/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-kube-api-access-wftz4\") pod \"crc-debug-l9v5d\" (UID: \"b7bc39db-bf2e-4fca-939f-3c508bca4e8a\") " pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.614543 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-host\") pod \"crc-debug-l9v5d\" (UID: \"b7bc39db-bf2e-4fca-939f-3c508bca4e8a\") " pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" Feb 17 15:58:29 crc 
kubenswrapper[4717]: I0217 15:58:29.639392 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wftz4\" (UniqueName: \"kubernetes.io/projected/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-kube-api-access-wftz4\") pod \"crc-debug-l9v5d\" (UID: \"b7bc39db-bf2e-4fca-939f-3c508bca4e8a\") " pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.698695 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.855061 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9562ac3-88f2-46d4-9fc1-2abf553fed5c" path="/var/lib/kubelet/pods/a9562ac3-88f2-46d4-9fc1-2abf553fed5c/volumes" Feb 17 15:58:29 crc kubenswrapper[4717]: I0217 15:58:29.955612 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" event={"ID":"b7bc39db-bf2e-4fca-939f-3c508bca4e8a","Type":"ContainerStarted","Data":"71fa04dce4a8c2bf4a78179b8355c40cec1ba2b21a95c7f6fbb656e6f2b6d850"} Feb 17 15:58:30 crc kubenswrapper[4717]: I0217 15:58:30.966142 4717 generic.go:334] "Generic (PLEG): container finished" podID="b7bc39db-bf2e-4fca-939f-3c508bca4e8a" containerID="f6c6080a47a851c899eae30826866091a4ff362014e9af85d9a5de0b1d288616" exitCode=0 Feb 17 15:58:30 crc kubenswrapper[4717]: I0217 15:58:30.966244 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" event={"ID":"b7bc39db-bf2e-4fca-939f-3c508bca4e8a","Type":"ContainerDied","Data":"f6c6080a47a851c899eae30826866091a4ff362014e9af85d9a5de0b1d288616"} Feb 17 15:58:31 crc kubenswrapper[4717]: I0217 15:58:31.336626 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hs9kl/crc-debug-l9v5d"] Feb 17 15:58:31 crc kubenswrapper[4717]: I0217 15:58:31.353430 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-hs9kl/crc-debug-l9v5d"] Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.072417 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.157979 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-host\") pod \"b7bc39db-bf2e-4fca-939f-3c508bca4e8a\" (UID: \"b7bc39db-bf2e-4fca-939f-3c508bca4e8a\") " Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.158176 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-host" (OuterVolumeSpecName: "host") pod "b7bc39db-bf2e-4fca-939f-3c508bca4e8a" (UID: "b7bc39db-bf2e-4fca-939f-3c508bca4e8a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.158261 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wftz4\" (UniqueName: \"kubernetes.io/projected/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-kube-api-access-wftz4\") pod \"b7bc39db-bf2e-4fca-939f-3c508bca4e8a\" (UID: \"b7bc39db-bf2e-4fca-939f-3c508bca4e8a\") " Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.158714 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-host\") on node \"crc\" DevicePath \"\"" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.166301 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-kube-api-access-wftz4" (OuterVolumeSpecName: "kube-api-access-wftz4") pod "b7bc39db-bf2e-4fca-939f-3c508bca4e8a" (UID: "b7bc39db-bf2e-4fca-939f-3c508bca4e8a"). 
InnerVolumeSpecName "kube-api-access-wftz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.260115 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wftz4\" (UniqueName: \"kubernetes.io/projected/b7bc39db-bf2e-4fca-939f-3c508bca4e8a-kube-api-access-wftz4\") on node \"crc\" DevicePath \"\"" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.606821 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hs9kl/crc-debug-zw456"] Feb 17 15:58:32 crc kubenswrapper[4717]: E0217 15:58:32.607235 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7bc39db-bf2e-4fca-939f-3c508bca4e8a" containerName="container-00" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.607251 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7bc39db-bf2e-4fca-939f-3c508bca4e8a" containerName="container-00" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.607451 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7bc39db-bf2e-4fca-939f-3c508bca4e8a" containerName="container-00" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.608029 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-zw456" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.666662 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccf9e331-88e5-4b76-939f-6f273a255223-host\") pod \"crc-debug-zw456\" (UID: \"ccf9e331-88e5-4b76-939f-6f273a255223\") " pod="openshift-must-gather-hs9kl/crc-debug-zw456" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.666744 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25x9w\" (UniqueName: \"kubernetes.io/projected/ccf9e331-88e5-4b76-939f-6f273a255223-kube-api-access-25x9w\") pod \"crc-debug-zw456\" (UID: \"ccf9e331-88e5-4b76-939f-6f273a255223\") " pod="openshift-must-gather-hs9kl/crc-debug-zw456" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.768581 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccf9e331-88e5-4b76-939f-6f273a255223-host\") pod \"crc-debug-zw456\" (UID: \"ccf9e331-88e5-4b76-939f-6f273a255223\") " pod="openshift-must-gather-hs9kl/crc-debug-zw456" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.768655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25x9w\" (UniqueName: \"kubernetes.io/projected/ccf9e331-88e5-4b76-939f-6f273a255223-kube-api-access-25x9w\") pod \"crc-debug-zw456\" (UID: \"ccf9e331-88e5-4b76-939f-6f273a255223\") " pod="openshift-must-gather-hs9kl/crc-debug-zw456" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.768732 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccf9e331-88e5-4b76-939f-6f273a255223-host\") pod \"crc-debug-zw456\" (UID: \"ccf9e331-88e5-4b76-939f-6f273a255223\") " pod="openshift-must-gather-hs9kl/crc-debug-zw456" Feb 17 15:58:32 crc 
kubenswrapper[4717]: I0217 15:58:32.783896 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25x9w\" (UniqueName: \"kubernetes.io/projected/ccf9e331-88e5-4b76-939f-6f273a255223-kube-api-access-25x9w\") pod \"crc-debug-zw456\" (UID: \"ccf9e331-88e5-4b76-939f-6f273a255223\") " pod="openshift-must-gather-hs9kl/crc-debug-zw456" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.923312 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-zw456" Feb 17 15:58:32 crc kubenswrapper[4717]: W0217 15:58:32.972423 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccf9e331_88e5_4b76_939f_6f273a255223.slice/crio-31b5e87106f0e524bc8a0d4baa2b103a3f62a47857408925763eb42ca529b7c4 WatchSource:0}: Error finding container 31b5e87106f0e524bc8a0d4baa2b103a3f62a47857408925763eb42ca529b7c4: Status 404 returned error can't find the container with id 31b5e87106f0e524bc8a0d4baa2b103a3f62a47857408925763eb42ca529b7c4 Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.985062 4717 scope.go:117] "RemoveContainer" containerID="f6c6080a47a851c899eae30826866091a4ff362014e9af85d9a5de0b1d288616" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.985200 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-l9v5d" Feb 17 15:58:32 crc kubenswrapper[4717]: I0217 15:58:32.986878 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hs9kl/crc-debug-zw456" event={"ID":"ccf9e331-88e5-4b76-939f-6f273a255223","Type":"ContainerStarted","Data":"31b5e87106f0e524bc8a0d4baa2b103a3f62a47857408925763eb42ca529b7c4"} Feb 17 15:58:33 crc kubenswrapper[4717]: I0217 15:58:33.857721 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7bc39db-bf2e-4fca-939f-3c508bca4e8a" path="/var/lib/kubelet/pods/b7bc39db-bf2e-4fca-939f-3c508bca4e8a/volumes" Feb 17 15:58:33 crc kubenswrapper[4717]: I0217 15:58:33.999576 4717 generic.go:334] "Generic (PLEG): container finished" podID="ccf9e331-88e5-4b76-939f-6f273a255223" containerID="c934c7d040604068557fc1c28cfab04c31709d43c3f331feabbd4df133b3376f" exitCode=0 Feb 17 15:58:33 crc kubenswrapper[4717]: I0217 15:58:33.999619 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hs9kl/crc-debug-zw456" event={"ID":"ccf9e331-88e5-4b76-939f-6f273a255223","Type":"ContainerDied","Data":"c934c7d040604068557fc1c28cfab04c31709d43c3f331feabbd4df133b3376f"} Feb 17 15:58:34 crc kubenswrapper[4717]: I0217 15:58:34.035663 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hs9kl/crc-debug-zw456"] Feb 17 15:58:34 crc kubenswrapper[4717]: I0217 15:58:34.045924 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hs9kl/crc-debug-zw456"] Feb 17 15:58:35 crc kubenswrapper[4717]: I0217 15:58:35.110359 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-zw456" Feb 17 15:58:35 crc kubenswrapper[4717]: I0217 15:58:35.211581 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25x9w\" (UniqueName: \"kubernetes.io/projected/ccf9e331-88e5-4b76-939f-6f273a255223-kube-api-access-25x9w\") pod \"ccf9e331-88e5-4b76-939f-6f273a255223\" (UID: \"ccf9e331-88e5-4b76-939f-6f273a255223\") " Feb 17 15:58:35 crc kubenswrapper[4717]: I0217 15:58:35.211831 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccf9e331-88e5-4b76-939f-6f273a255223-host\") pod \"ccf9e331-88e5-4b76-939f-6f273a255223\" (UID: \"ccf9e331-88e5-4b76-939f-6f273a255223\") " Feb 17 15:58:35 crc kubenswrapper[4717]: I0217 15:58:35.211944 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ccf9e331-88e5-4b76-939f-6f273a255223-host" (OuterVolumeSpecName: "host") pod "ccf9e331-88e5-4b76-939f-6f273a255223" (UID: "ccf9e331-88e5-4b76-939f-6f273a255223"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:58:35 crc kubenswrapper[4717]: I0217 15:58:35.212241 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccf9e331-88e5-4b76-939f-6f273a255223-host\") on node \"crc\" DevicePath \"\"" Feb 17 15:58:35 crc kubenswrapper[4717]: I0217 15:58:35.219125 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf9e331-88e5-4b76-939f-6f273a255223-kube-api-access-25x9w" (OuterVolumeSpecName: "kube-api-access-25x9w") pod "ccf9e331-88e5-4b76-939f-6f273a255223" (UID: "ccf9e331-88e5-4b76-939f-6f273a255223"). InnerVolumeSpecName "kube-api-access-25x9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:58:35 crc kubenswrapper[4717]: I0217 15:58:35.313722 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25x9w\" (UniqueName: \"kubernetes.io/projected/ccf9e331-88e5-4b76-939f-6f273a255223-kube-api-access-25x9w\") on node \"crc\" DevicePath \"\"" Feb 17 15:58:35 crc kubenswrapper[4717]: I0217 15:58:35.860577 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccf9e331-88e5-4b76-939f-6f273a255223" path="/var/lib/kubelet/pods/ccf9e331-88e5-4b76-939f-6f273a255223/volumes" Feb 17 15:58:36 crc kubenswrapper[4717]: I0217 15:58:36.021301 4717 scope.go:117] "RemoveContainer" containerID="c934c7d040604068557fc1c28cfab04c31709d43c3f331feabbd4df133b3376f" Feb 17 15:58:36 crc kubenswrapper[4717]: I0217 15:58:36.021433 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hs9kl/crc-debug-zw456" Feb 17 15:58:40 crc kubenswrapper[4717]: I0217 15:58:40.847327 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:58:40 crc kubenswrapper[4717]: E0217 15:58:40.848204 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.719065 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vf2t6"] Feb 17 15:58:43 crc kubenswrapper[4717]: E0217 15:58:43.719887 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf9e331-88e5-4b76-939f-6f273a255223" 
containerName="container-00" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.719899 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf9e331-88e5-4b76-939f-6f273a255223" containerName="container-00" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.720106 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccf9e331-88e5-4b76-939f-6f273a255223" containerName="container-00" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.722448 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.740954 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vf2t6"] Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.788868 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-utilities\") pod \"redhat-operators-vf2t6\" (UID: \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.789242 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f599j\" (UniqueName: \"kubernetes.io/projected/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-kube-api-access-f599j\") pod \"redhat-operators-vf2t6\" (UID: \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.789545 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-catalog-content\") pod \"redhat-operators-vf2t6\" (UID: \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " 
pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.890961 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-catalog-content\") pod \"redhat-operators-vf2t6\" (UID: \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.891152 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-utilities\") pod \"redhat-operators-vf2t6\" (UID: \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.891170 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f599j\" (UniqueName: \"kubernetes.io/projected/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-kube-api-access-f599j\") pod \"redhat-operators-vf2t6\" (UID: \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.891845 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-catalog-content\") pod \"redhat-operators-vf2t6\" (UID: \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:43 crc kubenswrapper[4717]: I0217 15:58:43.892055 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-utilities\") pod \"redhat-operators-vf2t6\" (UID: \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:43 crc 
kubenswrapper[4717]: I0217 15:58:43.927010 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f599j\" (UniqueName: \"kubernetes.io/projected/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-kube-api-access-f599j\") pod \"redhat-operators-vf2t6\" (UID: \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:44 crc kubenswrapper[4717]: I0217 15:58:44.079617 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:44 crc kubenswrapper[4717]: I0217 15:58:44.539089 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vf2t6"] Feb 17 15:58:45 crc kubenswrapper[4717]: I0217 15:58:45.105219 4717 generic.go:334] "Generic (PLEG): container finished" podID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" containerID="3da34137b781f695d22c1bc7b4eaedf78a9ff348ffae94c96a9ba603226064af" exitCode=0 Feb 17 15:58:45 crc kubenswrapper[4717]: I0217 15:58:45.105271 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf2t6" event={"ID":"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd","Type":"ContainerDied","Data":"3da34137b781f695d22c1bc7b4eaedf78a9ff348ffae94c96a9ba603226064af"} Feb 17 15:58:45 crc kubenswrapper[4717]: I0217 15:58:45.107219 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf2t6" event={"ID":"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd","Type":"ContainerStarted","Data":"2e33e2c085570f3e3d20f8c0bbc3e456d270b85f09217a65c803084713ebef7e"} Feb 17 15:58:45 crc kubenswrapper[4717]: I0217 15:58:45.107803 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:58:46 crc kubenswrapper[4717]: I0217 15:58:46.117432 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf2t6" 
event={"ID":"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd","Type":"ContainerStarted","Data":"008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79"} Feb 17 15:58:51 crc kubenswrapper[4717]: I0217 15:58:51.162760 4717 generic.go:334] "Generic (PLEG): container finished" podID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" containerID="008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79" exitCode=0 Feb 17 15:58:51 crc kubenswrapper[4717]: I0217 15:58:51.162824 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf2t6" event={"ID":"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd","Type":"ContainerDied","Data":"008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79"} Feb 17 15:58:51 crc kubenswrapper[4717]: I0217 15:58:51.847247 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:58:51 crc kubenswrapper[4717]: E0217 15:58:51.847643 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:58:52 crc kubenswrapper[4717]: I0217 15:58:52.174853 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf2t6" event={"ID":"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd","Type":"ContainerStarted","Data":"b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145"} Feb 17 15:58:52 crc kubenswrapper[4717]: I0217 15:58:52.201929 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vf2t6" podStartSLOduration=2.702343307 podStartE2EDuration="9.201906366s" 
podCreationTimestamp="2026-02-17 15:58:43 +0000 UTC" firstStartedPulling="2026-02-17 15:58:45.107602966 +0000 UTC m=+3991.523443442" lastFinishedPulling="2026-02-17 15:58:51.607166015 +0000 UTC m=+3998.023006501" observedRunningTime="2026-02-17 15:58:52.19746974 +0000 UTC m=+3998.613310226" watchObservedRunningTime="2026-02-17 15:58:52.201906366 +0000 UTC m=+3998.617746882" Feb 17 15:58:54 crc kubenswrapper[4717]: I0217 15:58:54.080296 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:54 crc kubenswrapper[4717]: I0217 15:58:54.080647 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:58:55 crc kubenswrapper[4717]: I0217 15:58:55.125205 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vf2t6" podUID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" containerName="registry-server" probeResult="failure" output=< Feb 17 15:58:55 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 17 15:58:55 crc kubenswrapper[4717]: > Feb 17 15:59:04 crc kubenswrapper[4717]: I0217 15:59:04.149685 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:59:04 crc kubenswrapper[4717]: I0217 15:59:04.206923 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:59:04 crc kubenswrapper[4717]: I0217 15:59:04.400071 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vf2t6"] Feb 17 15:59:04 crc kubenswrapper[4717]: I0217 15:59:04.847845 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:59:04 crc kubenswrapper[4717]: E0217 15:59:04.851168 4717 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:59:05 crc kubenswrapper[4717]: I0217 15:59:05.322957 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vf2t6" podUID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" containerName="registry-server" containerID="cri-o://b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145" gracePeriod=2 Feb 17 15:59:05 crc kubenswrapper[4717]: I0217 15:59:05.896003 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:59:05 crc kubenswrapper[4717]: I0217 15:59:05.949306 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-utilities\") pod \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\" (UID: \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " Feb 17 15:59:05 crc kubenswrapper[4717]: I0217 15:59:05.949384 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-catalog-content\") pod \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\" (UID: \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " Feb 17 15:59:05 crc kubenswrapper[4717]: I0217 15:59:05.949674 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f599j\" (UniqueName: \"kubernetes.io/projected/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-kube-api-access-f599j\") pod \"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\" (UID: 
\"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd\") " Feb 17 15:59:05 crc kubenswrapper[4717]: I0217 15:59:05.951563 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-utilities" (OuterVolumeSpecName: "utilities") pod "c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" (UID: "c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:59:05 crc kubenswrapper[4717]: I0217 15:59:05.957229 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-kube-api-access-f599j" (OuterVolumeSpecName: "kube-api-access-f599j") pod "c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" (UID: "c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd"). InnerVolumeSpecName "kube-api-access-f599j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.054347 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f599j\" (UniqueName: \"kubernetes.io/projected/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-kube-api-access-f599j\") on node \"crc\" DevicePath \"\"" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.054390 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.091853 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" (UID: "c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.156032 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.334263 4717 generic.go:334] "Generic (PLEG): container finished" podID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" containerID="b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145" exitCode=0 Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.334319 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vf2t6" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.334338 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf2t6" event={"ID":"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd","Type":"ContainerDied","Data":"b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145"} Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.334384 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf2t6" event={"ID":"c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd","Type":"ContainerDied","Data":"2e33e2c085570f3e3d20f8c0bbc3e456d270b85f09217a65c803084713ebef7e"} Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.334401 4717 scope.go:117] "RemoveContainer" containerID="b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.355638 4717 scope.go:117] "RemoveContainer" containerID="008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.377952 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vf2t6"] Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 
15:59:06.387738 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vf2t6"] Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.397787 4717 scope.go:117] "RemoveContainer" containerID="3da34137b781f695d22c1bc7b4eaedf78a9ff348ffae94c96a9ba603226064af" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.447543 4717 scope.go:117] "RemoveContainer" containerID="b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145" Feb 17 15:59:06 crc kubenswrapper[4717]: E0217 15:59:06.447928 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145\": container with ID starting with b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145 not found: ID does not exist" containerID="b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.447965 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145"} err="failed to get container status \"b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145\": rpc error: code = NotFound desc = could not find container \"b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145\": container with ID starting with b9f74ae90c2f767d7352876a91047276c8654524e2140be270a5fac0923b4145 not found: ID does not exist" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.447992 4717 scope.go:117] "RemoveContainer" containerID="008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79" Feb 17 15:59:06 crc kubenswrapper[4717]: E0217 15:59:06.453693 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79\": container with ID 
starting with 008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79 not found: ID does not exist" containerID="008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.453751 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79"} err="failed to get container status \"008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79\": rpc error: code = NotFound desc = could not find container \"008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79\": container with ID starting with 008bafd72687cd4337deb45a0c5f8b4351b30d1ca41d677a797c007e3b431c79 not found: ID does not exist" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.453780 4717 scope.go:117] "RemoveContainer" containerID="3da34137b781f695d22c1bc7b4eaedf78a9ff348ffae94c96a9ba603226064af" Feb 17 15:59:06 crc kubenswrapper[4717]: E0217 15:59:06.454204 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da34137b781f695d22c1bc7b4eaedf78a9ff348ffae94c96a9ba603226064af\": container with ID starting with 3da34137b781f695d22c1bc7b4eaedf78a9ff348ffae94c96a9ba603226064af not found: ID does not exist" containerID="3da34137b781f695d22c1bc7b4eaedf78a9ff348ffae94c96a9ba603226064af" Feb 17 15:59:06 crc kubenswrapper[4717]: I0217 15:59:06.454236 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da34137b781f695d22c1bc7b4eaedf78a9ff348ffae94c96a9ba603226064af"} err="failed to get container status \"3da34137b781f695d22c1bc7b4eaedf78a9ff348ffae94c96a9ba603226064af\": rpc error: code = NotFound desc = could not find container \"3da34137b781f695d22c1bc7b4eaedf78a9ff348ffae94c96a9ba603226064af\": container with ID starting with 3da34137b781f695d22c1bc7b4eaedf78a9ff348ffae94c96a9ba603226064af not found: 
ID does not exist" Feb 17 15:59:07 crc kubenswrapper[4717]: I0217 15:59:07.856715 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" path="/var/lib/kubelet/pods/c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd/volumes" Feb 17 15:59:07 crc kubenswrapper[4717]: I0217 15:59:07.878167 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-dc9f5cc6-vmvkc_24915ee8-ac0d-4c45-826f-6ca18351c1fd/barbican-api/0.log" Feb 17 15:59:08 crc kubenswrapper[4717]: I0217 15:59:08.040533 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-dc9f5cc6-vmvkc_24915ee8-ac0d-4c45-826f-6ca18351c1fd/barbican-api-log/0.log" Feb 17 15:59:08 crc kubenswrapper[4717]: I0217 15:59:08.108930 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-678d9d64d6-sj9md_492fb01b-1889-47ff-b334-750b9614cc60/barbican-keystone-listener/0.log" Feb 17 15:59:08 crc kubenswrapper[4717]: I0217 15:59:08.305295 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-678d9d64d6-sj9md_492fb01b-1889-47ff-b334-750b9614cc60/barbican-keystone-listener-log/0.log" Feb 17 15:59:08 crc kubenswrapper[4717]: I0217 15:59:08.345236 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-96ddcd589-gq7qg_0b3b8192-afd0-44c1-886f-7dc072112460/barbican-worker/0.log" Feb 17 15:59:08 crc kubenswrapper[4717]: I0217 15:59:08.379784 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-96ddcd589-gq7qg_0b3b8192-afd0-44c1-886f-7dc072112460/barbican-worker-log/0.log" Feb 17 15:59:08 crc kubenswrapper[4717]: I0217 15:59:08.566846 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-p57nk_fc967ec5-ea6b-4da3-a6a4-c0a75f0a6d0c/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:08 crc 
kubenswrapper[4717]: I0217 15:59:08.621770 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fa734ee-b468-4462-850d-9f347c991241/ceilometer-central-agent/0.log" Feb 17 15:59:08 crc kubenswrapper[4717]: I0217 15:59:08.738708 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fa734ee-b468-4462-850d-9f347c991241/ceilometer-notification-agent/0.log" Feb 17 15:59:08 crc kubenswrapper[4717]: I0217 15:59:08.778545 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fa734ee-b468-4462-850d-9f347c991241/proxy-httpd/0.log" Feb 17 15:59:08 crc kubenswrapper[4717]: I0217 15:59:08.814463 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6fa734ee-b468-4462-850d-9f347c991241/sg-core/0.log" Feb 17 15:59:08 crc kubenswrapper[4717]: I0217 15:59:08.959878 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_880f22f8-f3a5-479a-b456-7afdd5e7d96e/cinder-api/0.log" Feb 17 15:59:08 crc kubenswrapper[4717]: I0217 15:59:08.999832 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_880f22f8-f3a5-479a-b456-7afdd5e7d96e/cinder-api-log/0.log" Feb 17 15:59:09 crc kubenswrapper[4717]: I0217 15:59:09.099368 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7eef2c95-16ed-4b57-a95a-aa5b302ec564/cinder-scheduler/0.log" Feb 17 15:59:09 crc kubenswrapper[4717]: I0217 15:59:09.178344 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7eef2c95-16ed-4b57-a95a-aa5b302ec564/probe/0.log" Feb 17 15:59:09 crc kubenswrapper[4717]: I0217 15:59:09.330342 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lpwhh_ce8505ef-a14d-4936-b9c2-4334b5cf69b1/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:09 crc 
kubenswrapper[4717]: I0217 15:59:09.377304 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5kzz4_d0c4edef-fee0-490a-8c25-9e4c9950c04f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:09 crc kubenswrapper[4717]: I0217 15:59:09.504846 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-dncpb_394b7b44-48ed-406e-be48-6f7cffabfaf9/init/0.log" Feb 17 15:59:09 crc kubenswrapper[4717]: I0217 15:59:09.683670 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-dncpb_394b7b44-48ed-406e-be48-6f7cffabfaf9/init/0.log" Feb 17 15:59:09 crc kubenswrapper[4717]: I0217 15:59:09.740482 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-dncpb_394b7b44-48ed-406e-be48-6f7cffabfaf9/dnsmasq-dns/0.log" Feb 17 15:59:09 crc kubenswrapper[4717]: I0217 15:59:09.761501 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xt86s_6f01b139-f61b-4935-930c-65756bd54cdc/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:09 crc kubenswrapper[4717]: I0217 15:59:09.935127 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f/glance-httpd/0.log" Feb 17 15:59:09 crc kubenswrapper[4717]: I0217 15:59:09.961564 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f91f5d0e-cbf8-4977-b9a6-1bfffe081b2f/glance-log/0.log" Feb 17 15:59:10 crc kubenswrapper[4717]: I0217 15:59:10.082208 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cb47f318-2779-4288-b0ce-775766436b6b/glance-httpd/0.log" Feb 17 15:59:10 crc kubenswrapper[4717]: I0217 15:59:10.144963 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_cb47f318-2779-4288-b0ce-775766436b6b/glance-log/0.log" Feb 17 15:59:10 crc kubenswrapper[4717]: I0217 15:59:10.319938 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-85b46995b-rj5bq_04bb64de-6640-4f6a-9052-ff0edf9dacb8/horizon/0.log" Feb 17 15:59:10 crc kubenswrapper[4717]: I0217 15:59:10.443649 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-9t5hk_8d553699-af50-41f7-b4da-1e0182788f60/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:10 crc kubenswrapper[4717]: I0217 15:59:10.680235 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mgr9p_7b728f97-9d73-433e-a910-13591303221e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:10 crc kubenswrapper[4717]: I0217 15:59:10.685239 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-85b46995b-rj5bq_04bb64de-6640-4f6a-9052-ff0edf9dacb8/horizon-log/0.log" Feb 17 15:59:11 crc kubenswrapper[4717]: I0217 15:59:11.037801 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-96f9cc575-vd9jv_c8ee178e-3086-48e3-ad91-aefa00e0d10e/keystone-api/0.log" Feb 17 15:59:11 crc kubenswrapper[4717]: I0217 15:59:11.136627 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_10b56503-0d1b-474b-8b7c-0d07f5eae27b/kube-state-metrics/0.log" Feb 17 15:59:11 crc kubenswrapper[4717]: I0217 15:59:11.564661 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ts54t_fff26fba-baaf-4ed4-9c5b-b6dec300d19c/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:12 crc kubenswrapper[4717]: I0217 15:59:12.258819 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-kzvxb_e4ab62d3-2bec-418d-ad69-7d384f86652c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:12 crc kubenswrapper[4717]: I0217 15:59:12.344864 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b696c957f-jwh8l_d5fc5d05-07ba-476e-835e-61dfa2e9edc1/neutron-httpd/0.log" Feb 17 15:59:12 crc kubenswrapper[4717]: I0217 15:59:12.394020 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b696c957f-jwh8l_d5fc5d05-07ba-476e-835e-61dfa2e9edc1/neutron-api/0.log" Feb 17 15:59:12 crc kubenswrapper[4717]: I0217 15:59:12.988825 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a9a8f201-b55e-47e3-9d85-18d73631f9ae/nova-api-log/0.log" Feb 17 15:59:13 crc kubenswrapper[4717]: I0217 15:59:13.002881 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_18f24db6-956b-479c-88b5-283ae2b17f4d/nova-cell0-conductor-conductor/0.log" Feb 17 15:59:13 crc kubenswrapper[4717]: I0217 15:59:13.278412 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a9a8f201-b55e-47e3-9d85-18d73631f9ae/nova-api-api/0.log" Feb 17 15:59:13 crc kubenswrapper[4717]: I0217 15:59:13.281791 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_731c9760-bb30-4dd0-b246-0fb9ed312ae9/nova-cell1-conductor-conductor/0.log" Feb 17 15:59:13 crc kubenswrapper[4717]: I0217 15:59:13.345312 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_90d942b5-ae77-4210-b456-ca573622fc06/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 15:59:13 crc kubenswrapper[4717]: I0217 15:59:13.770107 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-sftg9_377efa15-97db-4618-85fd-1185cefde9a7/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:13 crc kubenswrapper[4717]: I0217 15:59:13.877337 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f3c5ad1f-898b-4643-80a5-6946068bf842/nova-metadata-log/0.log" Feb 17 15:59:14 crc kubenswrapper[4717]: I0217 15:59:14.239321 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a119d602-10c8-4b7b-aa61-77774c7f024f/nova-scheduler-scheduler/0.log" Feb 17 15:59:14 crc kubenswrapper[4717]: I0217 15:59:14.297135 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_14b7d39d-8183-4e96-a163-b72323ccb0b5/mysql-bootstrap/0.log" Feb 17 15:59:14 crc kubenswrapper[4717]: I0217 15:59:14.429290 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_14b7d39d-8183-4e96-a163-b72323ccb0b5/mysql-bootstrap/0.log" Feb 17 15:59:14 crc kubenswrapper[4717]: I0217 15:59:14.429752 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_14b7d39d-8183-4e96-a163-b72323ccb0b5/galera/0.log" Feb 17 15:59:14 crc kubenswrapper[4717]: I0217 15:59:14.840952 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8ec72b87-16f5-487e-ae08-a52b5d289bee/mysql-bootstrap/0.log" Feb 17 15:59:14 crc kubenswrapper[4717]: I0217 15:59:14.940427 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8ec72b87-16f5-487e-ae08-a52b5d289bee/mysql-bootstrap/0.log" Feb 17 15:59:14 crc kubenswrapper[4717]: I0217 15:59:14.989008 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8ec72b87-16f5-487e-ae08-a52b5d289bee/galera/0.log" Feb 17 15:59:15 crc kubenswrapper[4717]: I0217 15:59:15.115617 4717 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_openstackclient_8cf42885-3509-4779-901c-e88f11c5fdfd/openstackclient/0.log" Feb 17 15:59:15 crc kubenswrapper[4717]: I0217 15:59:15.323859 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-k4zc7_edd6eb53-55b7-4a61-867c-e4bf277af963/ovn-controller/0.log" Feb 17 15:59:15 crc kubenswrapper[4717]: I0217 15:59:15.370927 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f3c5ad1f-898b-4643-80a5-6946068bf842/nova-metadata-metadata/0.log" Feb 17 15:59:15 crc kubenswrapper[4717]: I0217 15:59:15.422257 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xcmtm_e46114b9-a8e8-4c90-bc25-4a3c3a4a4d74/openstack-network-exporter/0.log" Feb 17 15:59:15 crc kubenswrapper[4717]: I0217 15:59:15.671972 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5r2q4_1954ed93-1aa6-4c08-8379-01d047f5da20/ovsdb-server-init/0.log" Feb 17 15:59:15 crc kubenswrapper[4717]: I0217 15:59:15.842863 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5r2q4_1954ed93-1aa6-4c08-8379-01d047f5da20/ovs-vswitchd/0.log" Feb 17 15:59:15 crc kubenswrapper[4717]: I0217 15:59:15.923739 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5r2q4_1954ed93-1aa6-4c08-8379-01d047f5da20/ovsdb-server-init/0.log" Feb 17 15:59:15 crc kubenswrapper[4717]: I0217 15:59:15.948965 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5r2q4_1954ed93-1aa6-4c08-8379-01d047f5da20/ovsdb-server/0.log" Feb 17 15:59:16 crc kubenswrapper[4717]: I0217 15:59:16.173673 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-mvpm4_f366c26c-4b32-488e-8738-dbbf0ddd3adc/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:16 crc kubenswrapper[4717]: I0217 
15:59:16.200921 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a56b9da3-623b-44df-861c-62c9b45566db/openstack-network-exporter/0.log" Feb 17 15:59:16 crc kubenswrapper[4717]: I0217 15:59:16.215205 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a56b9da3-623b-44df-861c-62c9b45566db/ovn-northd/0.log" Feb 17 15:59:16 crc kubenswrapper[4717]: I0217 15:59:16.398115 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a71caa4-7d53-466e-8d74-98c814d3afda/openstack-network-exporter/0.log" Feb 17 15:59:16 crc kubenswrapper[4717]: I0217 15:59:16.449396 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a71caa4-7d53-466e-8d74-98c814d3afda/ovsdbserver-nb/0.log" Feb 17 15:59:16 crc kubenswrapper[4717]: I0217 15:59:16.582113 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4/openstack-network-exporter/0.log" Feb 17 15:59:16 crc kubenswrapper[4717]: I0217 15:59:16.676547 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_82b6adf7-3c9c-4a5d-8609-9a6d86d54ff4/ovsdbserver-sb/0.log" Feb 17 15:59:16 crc kubenswrapper[4717]: I0217 15:59:16.807158 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7fb4dbdc46-npdpn_c1670724-02ac-4de2-b161-c2b78ecb9bb0/placement-api/0.log" Feb 17 15:59:16 crc kubenswrapper[4717]: I0217 15:59:16.850273 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7fb4dbdc46-npdpn_c1670724-02ac-4de2-b161-c2b78ecb9bb0/placement-log/0.log" Feb 17 15:59:16 crc kubenswrapper[4717]: I0217 15:59:16.978797 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b/setup-container/0.log" Feb 17 15:59:17 crc kubenswrapper[4717]: I0217 15:59:17.172829 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b/rabbitmq/0.log" Feb 17 15:59:17 crc kubenswrapper[4717]: I0217 15:59:17.191819 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7/setup-container/0.log" Feb 17 15:59:17 crc kubenswrapper[4717]: I0217 15:59:17.212479 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dfbeb262-7d4a-4ec3-ac8b-0b46e0e58f2b/setup-container/0.log" Feb 17 15:59:17 crc kubenswrapper[4717]: I0217 15:59:17.471706 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7/rabbitmq/0.log" Feb 17 15:59:17 crc kubenswrapper[4717]: I0217 15:59:17.483001 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_93a4dd31-2da7-4bf5-8ad0-6c17ec0fcba7/setup-container/0.log" Feb 17 15:59:17 crc kubenswrapper[4717]: I0217 15:59:17.502984 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-kgdxp_dcbb0902-ee67-4df1-b420-f299e4400354/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:17 crc kubenswrapper[4717]: I0217 15:59:17.732965 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-678sx_2047a6dc-1a1c-4b26-bee6-c16d812a99df/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:17 crc kubenswrapper[4717]: I0217 15:59:17.749676 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-qzzdm_de72738c-0584-4539-9bd9-92382a0f5538/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:17 crc kubenswrapper[4717]: I0217 15:59:17.848638 4717 scope.go:117] "RemoveContainer" 
containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:59:17 crc kubenswrapper[4717]: E0217 15:59:17.848839 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:59:17 crc kubenswrapper[4717]: I0217 15:59:17.974443 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-dbw5h_c60befe9-ac76-4ba1-9cd0-15154e3c4e7a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:18 crc kubenswrapper[4717]: I0217 15:59:18.058693 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-6rtxk_205b31b1-1e73-466f-9ede-0248217b4356/ssh-known-hosts-edpm-deployment/0.log" Feb 17 15:59:18 crc kubenswrapper[4717]: I0217 15:59:18.332786 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-874f74f55-n92h5_8206d35f-44b8-45f5-9286-8e4179701b96/proxy-httpd/0.log" Feb 17 15:59:18 crc kubenswrapper[4717]: I0217 15:59:18.361627 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-874f74f55-n92h5_8206d35f-44b8-45f5-9286-8e4179701b96/proxy-server/0.log" Feb 17 15:59:18 crc kubenswrapper[4717]: I0217 15:59:18.559017 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-rhzj6_929feb4c-fd82-4293-9a44-a6f53816cdae/swift-ring-rebalance/0.log" Feb 17 15:59:18 crc kubenswrapper[4717]: I0217 15:59:18.655574 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/account-auditor/0.log" Feb 17 15:59:18 
crc kubenswrapper[4717]: I0217 15:59:18.745155 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/account-reaper/0.log" Feb 17 15:59:18 crc kubenswrapper[4717]: I0217 15:59:18.829812 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/account-replicator/0.log" Feb 17 15:59:18 crc kubenswrapper[4717]: I0217 15:59:18.902822 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/container-auditor/0.log" Feb 17 15:59:18 crc kubenswrapper[4717]: I0217 15:59:18.910280 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/account-server/0.log" Feb 17 15:59:18 crc kubenswrapper[4717]: I0217 15:59:18.942280 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/container-replicator/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.061011 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/container-server/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.124970 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/container-updater/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.140322 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/object-auditor/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.151921 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/object-expirer/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.316269 4717 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/object-server/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.320570 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/object-replicator/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.338862 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/object-updater/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.387503 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/rsync/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.605989 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nvhsv_92941f8a-938e-41b2-ae92-ea490e7050d9/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.609945 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_518c6b94-614f-42fd-9016-122cdcfcb8c9/swift-recon-cron/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.805330 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_6bc8b07d-6032-43fa-821d-fa1685427d56/tempest-tests-tempest-tests-runner/0.log" Feb 17 15:59:19 crc kubenswrapper[4717]: I0217 15:59:19.824956 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2f75b776-5a46-4241-adb5-eb50dbc8ba0f/test-operator-logs-container/0.log" Feb 17 15:59:20 crc kubenswrapper[4717]: I0217 15:59:20.017630 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-frrhf_6dfba30c-cf0e-4165-bc37-8284cc15b50f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 15:59:28 crc kubenswrapper[4717]: I0217 15:59:28.026875 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0ae80c8e-65ba-4b6b-a3d6-6dcb215aca1f/memcached/0.log" Feb 17 15:59:29 crc kubenswrapper[4717]: I0217 15:59:29.847455 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:59:29 crc kubenswrapper[4717]: E0217 15:59:29.848478 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.083125 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bjvsr"] Feb 17 15:59:39 crc kubenswrapper[4717]: E0217 15:59:39.084322 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" containerName="extract-utilities" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.084346 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" containerName="extract-utilities" Feb 17 15:59:39 crc kubenswrapper[4717]: E0217 15:59:39.084368 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" containerName="extract-content" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.084382 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" 
containerName="extract-content" Feb 17 15:59:39 crc kubenswrapper[4717]: E0217 15:59:39.084403 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" containerName="registry-server" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.084415 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" containerName="registry-server" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.084794 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fde987-bbb2-4e90-b3a8-ca1f22c79fbd" containerName="registry-server" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.087164 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.105475 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjvsr"] Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.240036 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-utilities\") pod \"redhat-marketplace-bjvsr\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.240570 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7vxq\" (UniqueName: \"kubernetes.io/projected/70b315d7-5138-42f3-8d6f-0e06ac2ede33-kube-api-access-n7vxq\") pod \"redhat-marketplace-bjvsr\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.240728 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-catalog-content\") pod \"redhat-marketplace-bjvsr\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.343038 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-utilities\") pod \"redhat-marketplace-bjvsr\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.343469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7vxq\" (UniqueName: \"kubernetes.io/projected/70b315d7-5138-42f3-8d6f-0e06ac2ede33-kube-api-access-n7vxq\") pod \"redhat-marketplace-bjvsr\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.343593 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-catalog-content\") pod \"redhat-marketplace-bjvsr\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.343641 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-utilities\") pod \"redhat-marketplace-bjvsr\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.343918 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-catalog-content\") pod \"redhat-marketplace-bjvsr\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.522485 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7vxq\" (UniqueName: \"kubernetes.io/projected/70b315d7-5138-42f3-8d6f-0e06ac2ede33-kube-api-access-n7vxq\") pod \"redhat-marketplace-bjvsr\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:39 crc kubenswrapper[4717]: I0217 15:59:39.715808 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:40 crc kubenswrapper[4717]: I0217 15:59:40.191468 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjvsr"] Feb 17 15:59:40 crc kubenswrapper[4717]: I0217 15:59:40.631927 4717 generic.go:334] "Generic (PLEG): container finished" podID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" containerID="0722dd950c329f1f3b123e6ae12b4ae1efaf8660984acd1f9ffe4bb533956aff" exitCode=0 Feb 17 15:59:40 crc kubenswrapper[4717]: I0217 15:59:40.632029 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjvsr" event={"ID":"70b315d7-5138-42f3-8d6f-0e06ac2ede33","Type":"ContainerDied","Data":"0722dd950c329f1f3b123e6ae12b4ae1efaf8660984acd1f9ffe4bb533956aff"} Feb 17 15:59:40 crc kubenswrapper[4717]: I0217 15:59:40.632422 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjvsr" event={"ID":"70b315d7-5138-42f3-8d6f-0e06ac2ede33","Type":"ContainerStarted","Data":"a710b70ea459507f73c8b67db303b6e4dbf6950390196325786e0404931a495c"} Feb 17 15:59:40 crc kubenswrapper[4717]: I0217 15:59:40.846767 4717 scope.go:117] "RemoveContainer" 
containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:59:40 crc kubenswrapper[4717]: E0217 15:59:40.847141 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:59:41 crc kubenswrapper[4717]: I0217 15:59:41.643690 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjvsr" event={"ID":"70b315d7-5138-42f3-8d6f-0e06ac2ede33","Type":"ContainerStarted","Data":"72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4"} Feb 17 15:59:42 crc kubenswrapper[4717]: I0217 15:59:42.653826 4717 generic.go:334] "Generic (PLEG): container finished" podID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" containerID="72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4" exitCode=0 Feb 17 15:59:42 crc kubenswrapper[4717]: I0217 15:59:42.653939 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjvsr" event={"ID":"70b315d7-5138-42f3-8d6f-0e06ac2ede33","Type":"ContainerDied","Data":"72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4"} Feb 17 15:59:43 crc kubenswrapper[4717]: I0217 15:59:43.665872 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjvsr" event={"ID":"70b315d7-5138-42f3-8d6f-0e06ac2ede33","Type":"ContainerStarted","Data":"68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8"} Feb 17 15:59:43 crc kubenswrapper[4717]: I0217 15:59:43.689152 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bjvsr" 
podStartSLOduration=2.247323552 podStartE2EDuration="4.689134073s" podCreationTimestamp="2026-02-17 15:59:39 +0000 UTC" firstStartedPulling="2026-02-17 15:59:40.634002518 +0000 UTC m=+4047.049843024" lastFinishedPulling="2026-02-17 15:59:43.075813079 +0000 UTC m=+4049.491653545" observedRunningTime="2026-02-17 15:59:43.688869365 +0000 UTC m=+4050.104709861" watchObservedRunningTime="2026-02-17 15:59:43.689134073 +0000 UTC m=+4050.104974549" Feb 17 15:59:49 crc kubenswrapper[4717]: I0217 15:59:49.717613 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:49 crc kubenswrapper[4717]: I0217 15:59:49.718285 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:50 crc kubenswrapper[4717]: I0217 15:59:50.061773 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:50 crc kubenswrapper[4717]: I0217 15:59:50.251094 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/util/0.log" Feb 17 15:59:50 crc kubenswrapper[4717]: I0217 15:59:50.383454 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/util/0.log" Feb 17 15:59:50 crc kubenswrapper[4717]: I0217 15:59:50.396834 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/pull/0.log" Feb 17 15:59:50 crc kubenswrapper[4717]: I0217 15:59:50.436273 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/pull/0.log" Feb 17 15:59:50 crc kubenswrapper[4717]: I0217 15:59:50.579481 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/pull/0.log" Feb 17 15:59:50 crc kubenswrapper[4717]: I0217 15:59:50.614707 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/util/0.log" Feb 17 15:59:50 crc kubenswrapper[4717]: I0217 15:59:50.629129 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5a61c443b84ef04991718f2e6c1e306d5a63937461d26e7de9bd4064b1fmlmb_af730b93-e04c-4361-9e27-67ed0596569a/extract/0.log" Feb 17 15:59:50 crc kubenswrapper[4717]: I0217 15:59:50.768398 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:50 crc kubenswrapper[4717]: I0217 15:59:50.814946 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjvsr"] Feb 17 15:59:51 crc kubenswrapper[4717]: I0217 15:59:51.104880 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-5ddqf_55e71e0a-9623-4049-b828-77040b5dd36e/manager/0.log" Feb 17 15:59:51 crc kubenswrapper[4717]: I0217 15:59:51.427252 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-qdg4j_0004ea51-4233-47ad-a9d9-8e5d745a55f8/manager/0.log" Feb 17 15:59:51 crc kubenswrapper[4717]: I0217 15:59:51.821963 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-5d768_c12ec16a-c8fd-48ae-8c86-257bcef97050/manager/0.log" Feb 17 15:59:52 crc kubenswrapper[4717]: I0217 15:59:52.001924 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-9kp5d_9bc8e53b-549b-48d9-810f-25ce640b7339/manager/0.log" Feb 17 15:59:52 crc kubenswrapper[4717]: I0217 15:59:52.513130 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-lcth2_5390fc8f-fcef-4a64-8d3c-4ba8e9b2f86d/manager/0.log" Feb 17 15:59:52 crc kubenswrapper[4717]: I0217 15:59:52.665391 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-d6ssr_9aa57429-09b8-4262-8356-fd8ea486b236/manager/0.log" Feb 17 15:59:52 crc kubenswrapper[4717]: I0217 15:59:52.746417 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bjvsr" podUID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" containerName="registry-server" containerID="cri-o://68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8" gracePeriod=2 Feb 17 15:59:52 crc kubenswrapper[4717]: I0217 15:59:52.932058 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-5rhm9_aee98c41-a3b5-43ba-b272-279e5836df0b/manager/0.log" Feb 17 15:59:52 crc kubenswrapper[4717]: I0217 15:59:52.955061 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-d7sx6_6edcc364-656f-4f2d-aa9a-3409b3b58471/manager/0.log" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.158221 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-dtxn5_514fed4f-53b1-4b52-8b25-7e4ec648e155/manager/0.log" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.256695 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.391792 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-s6xjq_563dc2c2-73a1-485a-ab9e-6f7e0b3423cb/manager/0.log" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.434422 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-catalog-content\") pod \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.434495 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-utilities\") pod \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.434637 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7vxq\" (UniqueName: \"kubernetes.io/projected/70b315d7-5138-42f3-8d6f-0e06ac2ede33-kube-api-access-n7vxq\") pod \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\" (UID: \"70b315d7-5138-42f3-8d6f-0e06ac2ede33\") " Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.435414 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-utilities" (OuterVolumeSpecName: "utilities") pod "70b315d7-5138-42f3-8d6f-0e06ac2ede33" (UID: "70b315d7-5138-42f3-8d6f-0e06ac2ede33"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.445449 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70b315d7-5138-42f3-8d6f-0e06ac2ede33-kube-api-access-n7vxq" (OuterVolumeSpecName: "kube-api-access-n7vxq") pod "70b315d7-5138-42f3-8d6f-0e06ac2ede33" (UID: "70b315d7-5138-42f3-8d6f-0e06ac2ede33"). InnerVolumeSpecName "kube-api-access-n7vxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.474680 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70b315d7-5138-42f3-8d6f-0e06ac2ede33" (UID: "70b315d7-5138-42f3-8d6f-0e06ac2ede33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.480408 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-gmfn2_22accf0b-8a4d-478a-bc45-d5bd4aa45b87/manager/0.log" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.536272 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7vxq\" (UniqueName: \"kubernetes.io/projected/70b315d7-5138-42f3-8d6f-0e06ac2ede33-kube-api-access-n7vxq\") on node \"crc\" DevicePath \"\"" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.536300 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.536311 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/70b315d7-5138-42f3-8d6f-0e06ac2ede33-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.727296 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-22tfq_3a97433f-8ce2-446e-92ef-170a4996ffe8/manager/0.log" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.757426 4717 generic.go:334] "Generic (PLEG): container finished" podID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" containerID="68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8" exitCode=0 Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.757465 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjvsr" event={"ID":"70b315d7-5138-42f3-8d6f-0e06ac2ede33","Type":"ContainerDied","Data":"68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8"} Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.757494 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjvsr" event={"ID":"70b315d7-5138-42f3-8d6f-0e06ac2ede33","Type":"ContainerDied","Data":"a710b70ea459507f73c8b67db303b6e4dbf6950390196325786e0404931a495c"} Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.757512 4717 scope.go:117] "RemoveContainer" containerID="68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.757524 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjvsr" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.774542 4717 scope.go:117] "RemoveContainer" containerID="72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.800147 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjvsr"] Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.808912 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjvsr"] Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.822342 4717 scope.go:117] "RemoveContainer" containerID="0722dd950c329f1f3b123e6ae12b4ae1efaf8660984acd1f9ffe4bb533956aff" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.855789 4717 scope.go:117] "RemoveContainer" containerID="68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8" Feb 17 15:59:53 crc kubenswrapper[4717]: E0217 15:59:53.869382 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8\": container with ID starting with 68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8 not found: ID does not exist" containerID="68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.869427 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8"} err="failed to get container status \"68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8\": rpc error: code = NotFound desc = could not find container \"68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8\": container with ID starting with 68a9b670ebbeb749f8fd2ee62dba9e1f1acebda45adeab6b48378a94f049eee8 not found: 
ID does not exist" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.869461 4717 scope.go:117] "RemoveContainer" containerID="72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4" Feb 17 15:59:53 crc kubenswrapper[4717]: E0217 15:59:53.869953 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4\": container with ID starting with 72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4 not found: ID does not exist" containerID="72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.869996 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4"} err="failed to get container status \"72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4\": rpc error: code = NotFound desc = could not find container \"72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4\": container with ID starting with 72c2626867af1a28ef81b30ac52009f11f854d207e1f94583affd355799ddad4 not found: ID does not exist" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.870025 4717 scope.go:117] "RemoveContainer" containerID="0722dd950c329f1f3b123e6ae12b4ae1efaf8660984acd1f9ffe4bb533956aff" Feb 17 15:59:53 crc kubenswrapper[4717]: E0217 15:59:53.870361 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0722dd950c329f1f3b123e6ae12b4ae1efaf8660984acd1f9ffe4bb533956aff\": container with ID starting with 0722dd950c329f1f3b123e6ae12b4ae1efaf8660984acd1f9ffe4bb533956aff not found: ID does not exist" containerID="0722dd950c329f1f3b123e6ae12b4ae1efaf8660984acd1f9ffe4bb533956aff" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.870381 4717 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0722dd950c329f1f3b123e6ae12b4ae1efaf8660984acd1f9ffe4bb533956aff"} err="failed to get container status \"0722dd950c329f1f3b123e6ae12b4ae1efaf8660984acd1f9ffe4bb533956aff\": rpc error: code = NotFound desc = could not find container \"0722dd950c329f1f3b123e6ae12b4ae1efaf8660984acd1f9ffe4bb533956aff\": container with ID starting with 0722dd950c329f1f3b123e6ae12b4ae1efaf8660984acd1f9ffe4bb533956aff not found: ID does not exist" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.873014 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" path="/var/lib/kubelet/pods/70b315d7-5138-42f3-8d6f-0e06ac2ede33/volumes" Feb 17 15:59:53 crc kubenswrapper[4717]: I0217 15:59:53.914187 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cm8796_b4295f2b-c9b6-4604-b910-525c07cca2ed/manager/0.log" Feb 17 15:59:54 crc kubenswrapper[4717]: I0217 15:59:54.307513 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7948dfdc59-ljckc_ece1adcd-6e4d-4cf5-afd4-d108db8df6d6/operator/0.log" Feb 17 15:59:54 crc kubenswrapper[4717]: I0217 15:59:54.492043 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jpq9r_e69d57ba-48b1-48d9-b658-c0a86cf05ab4/registry-server/0.log" Feb 17 15:59:54 crc kubenswrapper[4717]: I0217 15:59:54.767257 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-x6nmz_8a4392b6-6232-4135-91f9-676c565446fc/manager/0.log" Feb 17 15:59:54 crc kubenswrapper[4717]: I0217 15:59:54.914375 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-hzzpc_ae48b931-1ac6-43ed-a407-b3fbb3d56178/manager/0.log" 
Feb 17 15:59:55 crc kubenswrapper[4717]: I0217 15:59:55.121774 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-2w74d_dd3497e6-6de6-4bdf-a23e-16adc21de6ab/operator/0.log" Feb 17 15:59:55 crc kubenswrapper[4717]: I0217 15:59:55.397930 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-j9tsj_2872debe-d42d-4955-bcca-5006aa7a2ecc/manager/0.log" Feb 17 15:59:55 crc kubenswrapper[4717]: I0217 15:59:55.854433 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 15:59:55 crc kubenswrapper[4717]: E0217 15:59:55.854657 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 15:59:55 crc kubenswrapper[4717]: I0217 15:59:55.905273 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-spbft_af8d050d-c4ea-4ff3-9a36-7d5b88b5d6cb/manager/0.log" Feb 17 15:59:55 crc kubenswrapper[4717]: I0217 15:59:55.937048 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-zlkwq_aaf337fd-63ab-42de-a465-02fecc40116b/manager/0.log" Feb 17 15:59:56 crc kubenswrapper[4717]: I0217 15:59:56.105297 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-jq8zs_5fdc2829-32d7-456a-99f3-e15b957b272e/manager/0.log" Feb 17 15:59:56 crc kubenswrapper[4717]: I0217 15:59:56.238595 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7d87bc949d-9mr4h_f2279bdf-746c-4e8c-8703-74a256bd7923/manager/0.log" Feb 17 15:59:56 crc kubenswrapper[4717]: I0217 15:59:56.490763 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-mspl4_3dcfb2d1-e765-4eb0-8300-f7567a34cae7/manager/0.log" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.171962 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd"] Feb 17 16:00:00 crc kubenswrapper[4717]: E0217 16:00:00.172662 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" containerName="registry-server" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.172674 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" containerName="registry-server" Feb 17 16:00:00 crc kubenswrapper[4717]: E0217 16:00:00.172686 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" containerName="extract-content" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.172693 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" containerName="extract-content" Feb 17 16:00:00 crc kubenswrapper[4717]: E0217 16:00:00.172714 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" containerName="extract-utilities" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.172720 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" containerName="extract-utilities" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.172888 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="70b315d7-5138-42f3-8d6f-0e06ac2ede33" 
containerName="registry-server" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.173507 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.174961 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.175196 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.184689 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd"] Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.360415 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnmzl\" (UniqueName: \"kubernetes.io/projected/3e440b30-98a6-4302-a10d-ac7a99ca6945-kube-api-access-jnmzl\") pod \"collect-profiles-29522400-kgrrd\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.360696 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e440b30-98a6-4302-a10d-ac7a99ca6945-config-volume\") pod \"collect-profiles-29522400-kgrrd\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.360771 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3e440b30-98a6-4302-a10d-ac7a99ca6945-secret-volume\") pod \"collect-profiles-29522400-kgrrd\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.462058 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnmzl\" (UniqueName: \"kubernetes.io/projected/3e440b30-98a6-4302-a10d-ac7a99ca6945-kube-api-access-jnmzl\") pod \"collect-profiles-29522400-kgrrd\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.462157 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e440b30-98a6-4302-a10d-ac7a99ca6945-config-volume\") pod \"collect-profiles-29522400-kgrrd\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.462262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e440b30-98a6-4302-a10d-ac7a99ca6945-secret-volume\") pod \"collect-profiles-29522400-kgrrd\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.463097 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e440b30-98a6-4302-a10d-ac7a99ca6945-config-volume\") pod \"collect-profiles-29522400-kgrrd\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.469297 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e440b30-98a6-4302-a10d-ac7a99ca6945-secret-volume\") pod \"collect-profiles-29522400-kgrrd\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.477044 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnmzl\" (UniqueName: \"kubernetes.io/projected/3e440b30-98a6-4302-a10d-ac7a99ca6945-kube-api-access-jnmzl\") pod \"collect-profiles-29522400-kgrrd\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.504665 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:00 crc kubenswrapper[4717]: I0217 16:00:00.953932 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd"] Feb 17 16:00:01 crc kubenswrapper[4717]: I0217 16:00:01.524576 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-fp8nh_cd977519-10c4-4afe-8f51-52f6cab597f9/manager/0.log" Feb 17 16:00:01 crc kubenswrapper[4717]: I0217 16:00:01.847958 4717 generic.go:334] "Generic (PLEG): container finished" podID="3e440b30-98a6-4302-a10d-ac7a99ca6945" containerID="67749b9c7e2cb2385c71153561346e53b3d74ca855e22f4a7e92385e9e631800" exitCode=0 Feb 17 16:00:01 crc kubenswrapper[4717]: I0217 16:00:01.854022 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" 
event={"ID":"3e440b30-98a6-4302-a10d-ac7a99ca6945","Type":"ContainerDied","Data":"67749b9c7e2cb2385c71153561346e53b3d74ca855e22f4a7e92385e9e631800"} Feb 17 16:00:01 crc kubenswrapper[4717]: I0217 16:00:01.854059 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" event={"ID":"3e440b30-98a6-4302-a10d-ac7a99ca6945","Type":"ContainerStarted","Data":"e7326d832277004bb116b68a1aabfdc56c0e13615f6c9897efa7e374c4da766c"} Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.267015 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.340486 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e440b30-98a6-4302-a10d-ac7a99ca6945-secret-volume\") pod \"3e440b30-98a6-4302-a10d-ac7a99ca6945\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.340846 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e440b30-98a6-4302-a10d-ac7a99ca6945-config-volume\") pod \"3e440b30-98a6-4302-a10d-ac7a99ca6945\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.340890 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnmzl\" (UniqueName: \"kubernetes.io/projected/3e440b30-98a6-4302-a10d-ac7a99ca6945-kube-api-access-jnmzl\") pod \"3e440b30-98a6-4302-a10d-ac7a99ca6945\" (UID: \"3e440b30-98a6-4302-a10d-ac7a99ca6945\") " Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.341477 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e440b30-98a6-4302-a10d-ac7a99ca6945-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "3e440b30-98a6-4302-a10d-ac7a99ca6945" (UID: "3e440b30-98a6-4302-a10d-ac7a99ca6945"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.347786 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e440b30-98a6-4302-a10d-ac7a99ca6945-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e440b30-98a6-4302-a10d-ac7a99ca6945" (UID: "3e440b30-98a6-4302-a10d-ac7a99ca6945"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.364101 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e440b30-98a6-4302-a10d-ac7a99ca6945-kube-api-access-jnmzl" (OuterVolumeSpecName: "kube-api-access-jnmzl") pod "3e440b30-98a6-4302-a10d-ac7a99ca6945" (UID: "3e440b30-98a6-4302-a10d-ac7a99ca6945"). InnerVolumeSpecName "kube-api-access-jnmzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.443065 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e440b30-98a6-4302-a10d-ac7a99ca6945-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.443332 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e440b30-98a6-4302-a10d-ac7a99ca6945-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.443401 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnmzl\" (UniqueName: \"kubernetes.io/projected/3e440b30-98a6-4302-a10d-ac7a99ca6945-kube-api-access-jnmzl\") on node \"crc\" DevicePath \"\"" Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.869670 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" event={"ID":"3e440b30-98a6-4302-a10d-ac7a99ca6945","Type":"ContainerDied","Data":"e7326d832277004bb116b68a1aabfdc56c0e13615f6c9897efa7e374c4da766c"} Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.869707 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522400-kgrrd" Feb 17 16:00:03 crc kubenswrapper[4717]: I0217 16:00:03.869714 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7326d832277004bb116b68a1aabfdc56c0e13615f6c9897efa7e374c4da766c" Feb 17 16:00:04 crc kubenswrapper[4717]: I0217 16:00:04.353017 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d"] Feb 17 16:00:04 crc kubenswrapper[4717]: I0217 16:00:04.364115 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522355-hcz2d"] Feb 17 16:00:05 crc kubenswrapper[4717]: I0217 16:00:05.865502 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5" path="/var/lib/kubelet/pods/6f2a2c42-47b5-4edf-96e0-b8ba5dec3ad5/volumes" Feb 17 16:00:07 crc kubenswrapper[4717]: I0217 16:00:07.168352 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 16:00:07 crc kubenswrapper[4717]: E0217 16:00:07.168634 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 16:00:13 crc kubenswrapper[4717]: I0217 16:00:13.271954 4717 scope.go:117] "RemoveContainer" containerID="4d78184bce8428bd71c74d9d6a6e9fa0d18a756b619038b235ecc71ad48eb1a8" Feb 17 16:00:18 crc kubenswrapper[4717]: I0217 16:00:18.846718 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 
16:00:18 crc kubenswrapper[4717]: E0217 16:00:18.847587 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 16:00:21 crc kubenswrapper[4717]: I0217 16:00:21.162666 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-vxkqh_f576665e-17f5-4704-bd20-5debf9fb8612/control-plane-machine-set-operator/0.log" Feb 17 16:00:22 crc kubenswrapper[4717]: I0217 16:00:22.103221 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9t64v_ba04adcf-e228-4486-a74a-e9846bdaa53f/machine-api-operator/0.log" Feb 17 16:00:22 crc kubenswrapper[4717]: I0217 16:00:22.152422 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9t64v_ba04adcf-e228-4486-a74a-e9846bdaa53f/kube-rbac-proxy/0.log" Feb 17 16:00:33 crc kubenswrapper[4717]: I0217 16:00:33.856188 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 16:00:33 crc kubenswrapper[4717]: E0217 16:00:33.857128 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 16:00:36 crc kubenswrapper[4717]: I0217 16:00:36.351257 
4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-kfbxb_448f5725-7b22-4ceb-8c7f-f1e8883d423b/cert-manager-controller/0.log" Feb 17 16:00:36 crc kubenswrapper[4717]: I0217 16:00:36.484477 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n2cmk_52e5e91c-8239-4ffb-bc9a-a5dc8780f6e3/cert-manager-cainjector/0.log" Feb 17 16:00:36 crc kubenswrapper[4717]: I0217 16:00:36.545794 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-n82c5_04796f6f-6755-4e30-9fec-7003006dd113/cert-manager-webhook/0.log" Feb 17 16:00:48 crc kubenswrapper[4717]: I0217 16:00:48.846520 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 16:00:48 crc kubenswrapper[4717]: E0217 16:00:48.847311 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 16:00:50 crc kubenswrapper[4717]: I0217 16:00:50.082107 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-7rlqn_303af50d-ffd5-4a3b-9020-b96bdcddc135/nmstate-console-plugin/0.log" Feb 17 16:00:50 crc kubenswrapper[4717]: I0217 16:00:50.284210 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-kcx4j_6c630162-18cc-4f15-80f4-2b1ad7a2e87c/nmstate-handler/0.log" Feb 17 16:00:50 crc kubenswrapper[4717]: I0217 16:00:50.381448 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-r6698_28ecae2c-05f9-43e9-ad1a-581b4d6e8fea/kube-rbac-proxy/0.log" Feb 17 16:00:50 crc kubenswrapper[4717]: I0217 16:00:50.429931 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-r6698_28ecae2c-05f9-43e9-ad1a-581b4d6e8fea/nmstate-metrics/0.log" Feb 17 16:00:50 crc kubenswrapper[4717]: I0217 16:00:50.588234 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-xgp5x_05310076-5092-4692-a1a2-69826306ea88/nmstate-operator/0.log" Feb 17 16:00:50 crc kubenswrapper[4717]: I0217 16:00:50.628956 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-pxrw7_005457ad-adeb-49ab-aaf7-6043dc2b6021/nmstate-webhook/0.log" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.200286 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29522401-q2fpv"] Feb 17 16:01:00 crc kubenswrapper[4717]: E0217 16:01:00.201195 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e440b30-98a6-4302-a10d-ac7a99ca6945" containerName="collect-profiles" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.201207 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e440b30-98a6-4302-a10d-ac7a99ca6945" containerName="collect-profiles" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.201391 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e440b30-98a6-4302-a10d-ac7a99ca6945" containerName="collect-profiles" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.201978 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.235131 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29522401-q2fpv"] Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.310034 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k28st\" (UniqueName: \"kubernetes.io/projected/13032a81-a6e8-4249-b0ec-7958e1788b80-kube-api-access-k28st\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.310361 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-config-data\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.310441 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-combined-ca-bundle\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.310507 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-fernet-keys\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.411537 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-combined-ca-bundle\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.412349 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-fernet-keys\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.412413 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k28st\" (UniqueName: \"kubernetes.io/projected/13032a81-a6e8-4249-b0ec-7958e1788b80-kube-api-access-k28st\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.412448 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-config-data\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.417457 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-combined-ca-bundle\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.417744 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-fernet-keys\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.418156 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-config-data\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.435959 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k28st\" (UniqueName: \"kubernetes.io/projected/13032a81-a6e8-4249-b0ec-7958e1788b80-kube-api-access-k28st\") pod \"keystone-cron-29522401-q2fpv\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:00 crc kubenswrapper[4717]: I0217 16:01:00.535517 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:01 crc kubenswrapper[4717]: I0217 16:01:01.029055 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29522401-q2fpv"] Feb 17 16:01:01 crc kubenswrapper[4717]: I0217 16:01:01.752544 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522401-q2fpv" event={"ID":"13032a81-a6e8-4249-b0ec-7958e1788b80","Type":"ContainerStarted","Data":"b301c9f2c0485166d1dd442a8ad939c6964b65d728ea28258ccf77c56237cb66"} Feb 17 16:01:01 crc kubenswrapper[4717]: I0217 16:01:01.752874 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522401-q2fpv" event={"ID":"13032a81-a6e8-4249-b0ec-7958e1788b80","Type":"ContainerStarted","Data":"04778ca34ada9f5c1aba49b519b2d0f3701c8ccc6e194cd49cb352e5ea62024c"} Feb 17 16:01:01 crc kubenswrapper[4717]: I0217 16:01:01.775249 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29522401-q2fpv" podStartSLOduration=1.775231095 podStartE2EDuration="1.775231095s" podCreationTimestamp="2026-02-17 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:01:01.774873545 +0000 UTC m=+4128.190714031" watchObservedRunningTime="2026-02-17 16:01:01.775231095 +0000 UTC m=+4128.191071571" Feb 17 16:01:01 crc kubenswrapper[4717]: I0217 16:01:01.847190 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 16:01:01 crc kubenswrapper[4717]: E0217 16:01:01.847488 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 16:01:03 crc kubenswrapper[4717]: I0217 16:01:03.784679 4717 generic.go:334] "Generic (PLEG): container finished" podID="13032a81-a6e8-4249-b0ec-7958e1788b80" containerID="b301c9f2c0485166d1dd442a8ad939c6964b65d728ea28258ccf77c56237cb66" exitCode=0 Feb 17 16:01:03 crc kubenswrapper[4717]: I0217 16:01:03.784780 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522401-q2fpv" event={"ID":"13032a81-a6e8-4249-b0ec-7958e1788b80","Type":"ContainerDied","Data":"b301c9f2c0485166d1dd442a8ad939c6964b65d728ea28258ccf77c56237cb66"} Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.160347 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.307463 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-fernet-keys\") pod \"13032a81-a6e8-4249-b0ec-7958e1788b80\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.307572 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-config-data\") pod \"13032a81-a6e8-4249-b0ec-7958e1788b80\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.307627 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k28st\" (UniqueName: \"kubernetes.io/projected/13032a81-a6e8-4249-b0ec-7958e1788b80-kube-api-access-k28st\") pod \"13032a81-a6e8-4249-b0ec-7958e1788b80\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.307650 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-combined-ca-bundle\") pod \"13032a81-a6e8-4249-b0ec-7958e1788b80\" (UID: \"13032a81-a6e8-4249-b0ec-7958e1788b80\") " Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.313564 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13032a81-a6e8-4249-b0ec-7958e1788b80-kube-api-access-k28st" (OuterVolumeSpecName: "kube-api-access-k28st") pod "13032a81-a6e8-4249-b0ec-7958e1788b80" (UID: "13032a81-a6e8-4249-b0ec-7958e1788b80"). InnerVolumeSpecName "kube-api-access-k28st". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.313861 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "13032a81-a6e8-4249-b0ec-7958e1788b80" (UID: "13032a81-a6e8-4249-b0ec-7958e1788b80"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.387922 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13032a81-a6e8-4249-b0ec-7958e1788b80" (UID: "13032a81-a6e8-4249-b0ec-7958e1788b80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.408203 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-config-data" (OuterVolumeSpecName: "config-data") pod "13032a81-a6e8-4249-b0ec-7958e1788b80" (UID: "13032a81-a6e8-4249-b0ec-7958e1788b80"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.409314 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.409339 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.409349 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k28st\" (UniqueName: \"kubernetes.io/projected/13032a81-a6e8-4249-b0ec-7958e1788b80-kube-api-access-k28st\") on node \"crc\" DevicePath \"\"" Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.409360 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13032a81-a6e8-4249-b0ec-7958e1788b80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.807389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522401-q2fpv" event={"ID":"13032a81-a6e8-4249-b0ec-7958e1788b80","Type":"ContainerDied","Data":"04778ca34ada9f5c1aba49b519b2d0f3701c8ccc6e194cd49cb352e5ea62024c"} Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.807635 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04778ca34ada9f5c1aba49b519b2d0f3701c8ccc6e194cd49cb352e5ea62024c" Feb 17 16:01:05 crc kubenswrapper[4717]: I0217 16:01:05.807694 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522401-q2fpv" Feb 17 16:01:15 crc kubenswrapper[4717]: I0217 16:01:15.846799 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 16:01:15 crc kubenswrapper[4717]: E0217 16:01:15.847583 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 16:01:21 crc kubenswrapper[4717]: I0217 16:01:21.155700 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7k7sc_72e0c267-99f7-4d36-83e8-219560a63667/kube-rbac-proxy/0.log" Feb 17 16:01:21 crc kubenswrapper[4717]: I0217 16:01:21.190051 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7k7sc_72e0c267-99f7-4d36-83e8-219560a63667/controller/0.log" Feb 17 16:01:21 crc kubenswrapper[4717]: I0217 16:01:21.834218 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-frr-files/0.log" Feb 17 16:01:21 crc kubenswrapper[4717]: I0217 16:01:21.955672 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-frr-files/0.log" Feb 17 16:01:21 crc kubenswrapper[4717]: I0217 16:01:21.970256 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-reloader/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.033592 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-metrics/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.077539 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-reloader/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.197548 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-frr-files/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.204193 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-reloader/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.212198 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-metrics/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.253357 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-metrics/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.434796 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-frr-files/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.446963 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-metrics/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.451972 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/cp-reloader/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.522869 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/controller/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.654353 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/frr-metrics/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.693477 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/kube-rbac-proxy/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.788764 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/kube-rbac-proxy-frr/0.log" Feb 17 16:01:22 crc kubenswrapper[4717]: I0217 16:01:22.884769 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/reloader/0.log" Feb 17 16:01:23 crc kubenswrapper[4717]: I0217 16:01:23.055971 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-8f7bl_acf9f85d-e0ff-419b-8b56-ceda8ffeb28a/frr-k8s-webhook-server/0.log" Feb 17 16:01:23 crc kubenswrapper[4717]: I0217 16:01:23.401116 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56fd4fb65-pddnn_5b9d38f5-133a-43ed-bfef-8e5a27fa200c/webhook-server/0.log" Feb 17 16:01:23 crc kubenswrapper[4717]: I0217 16:01:23.457849 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-774c978687-jj472_656da445-df5b-402b-8347-70aa45a92159/manager/0.log" Feb 17 16:01:23 crc kubenswrapper[4717]: I0217 16:01:23.620511 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vc7q2_0179ef66-9a6c-440d-8606-f2f040fe7b44/kube-rbac-proxy/0.log" Feb 17 16:01:24 crc kubenswrapper[4717]: I0217 16:01:24.145843 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-njs4q_fbd1bfd2-2f7b-4bf7-9690-e0a31a3b5321/frr/0.log" Feb 17 16:01:24 crc kubenswrapper[4717]: I0217 16:01:24.152803 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vc7q2_0179ef66-9a6c-440d-8606-f2f040fe7b44/speaker/0.log" Feb 17 16:01:30 crc kubenswrapper[4717]: I0217 16:01:30.847531 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9" Feb 17 16:01:32 crc kubenswrapper[4717]: I0217 16:01:32.077536 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"6d1eca0fe5740b1a6f347e856332dd2fb03e726854f088ec4724d465bf64a29e"} Feb 17 16:01:37 crc kubenswrapper[4717]: I0217 16:01:37.978383 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/util/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.157583 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/pull/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.173285 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/pull/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.184738 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/util/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.337276 4717 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/util/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.337961 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/extract/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.342694 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ncm4m_d2c7425a-c6a2-4c42-b731-a24715c81039/pull/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.495958 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-utilities/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.639466 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-utilities/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.661865 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-content/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.682605 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-content/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.849212 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-content/0.log" Feb 17 16:01:38 crc kubenswrapper[4717]: I0217 16:01:38.908193 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/extract-utilities/0.log" Feb 17 16:01:39 crc kubenswrapper[4717]: I0217 16:01:39.018228 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-utilities/0.log" Feb 17 16:01:39 crc kubenswrapper[4717]: I0217 16:01:39.241560 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-content/0.log" Feb 17 16:01:39 crc kubenswrapper[4717]: I0217 16:01:39.282455 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-utilities/0.log" Feb 17 16:01:39 crc kubenswrapper[4717]: I0217 16:01:39.353027 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-content/0.log" Feb 17 16:01:39 crc kubenswrapper[4717]: I0217 16:01:39.518365 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-content/0.log" Feb 17 16:01:39 crc kubenswrapper[4717]: I0217 16:01:39.546071 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b4kxp_43eecc72-1d78-474c-b7c3-a773e72acb6a/registry-server/0.log" Feb 17 16:01:39 crc kubenswrapper[4717]: I0217 16:01:39.560469 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/extract-utilities/0.log" Feb 17 16:01:39 crc kubenswrapper[4717]: I0217 16:01:39.763964 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/util/0.log" Feb 17 16:01:39 crc kubenswrapper[4717]: I0217 16:01:39.949542 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/util/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.003928 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/pull/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.027652 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/pull/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.147008 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rb6w5_46c61af5-321a-4195-8d6b-a95c65c5eae3/registry-server/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.211594 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/util/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.251909 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/extract/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.261283 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecasmmdb_3a5405a7-d00d-43ec-b3c4-ac0e52626876/pull/0.log" Feb 17 16:01:40 crc 
kubenswrapper[4717]: I0217 16:01:40.426691 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-njsvg_813c7436-6b2f-45ed-8fc8-d400f00c80fd/marketplace-operator/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.445006 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-utilities/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.660798 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-utilities/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.671225 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-content/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.681533 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-content/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.825699 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-utilities/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.866139 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/extract-content/0.log" Feb 17 16:01:40 crc kubenswrapper[4717]: I0217 16:01:40.989991 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jk5wb_c636add4-d170-4295-8eeb-3f4972cf20d0/registry-server/0.log" Feb 17 16:01:41 crc kubenswrapper[4717]: I0217 16:01:41.002848 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-utilities/0.log" Feb 17 16:01:41 crc kubenswrapper[4717]: I0217 16:01:41.204418 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-utilities/0.log" Feb 17 16:01:41 crc kubenswrapper[4717]: I0217 16:01:41.211477 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-content/0.log" Feb 17 16:01:41 crc kubenswrapper[4717]: I0217 16:01:41.215875 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-content/0.log" Feb 17 16:01:41 crc kubenswrapper[4717]: I0217 16:01:41.456123 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-content/0.log" Feb 17 16:01:41 crc kubenswrapper[4717]: I0217 16:01:41.458288 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/extract-utilities/0.log" Feb 17 16:01:41 crc kubenswrapper[4717]: I0217 16:01:41.600094 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cxbg6_454d2d52-d8e3-4069-8b17-36622c4f7561/registry-server/0.log" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.147234 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ml2bb"] Feb 17 16:02:22 crc kubenswrapper[4717]: E0217 16:02:22.148446 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13032a81-a6e8-4249-b0ec-7958e1788b80" containerName="keystone-cron" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.148469 4717 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="13032a81-a6e8-4249-b0ec-7958e1788b80" containerName="keystone-cron" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.148802 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="13032a81-a6e8-4249-b0ec-7958e1788b80" containerName="keystone-cron" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.152738 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ml2bb" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.158657 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ml2bb"] Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.256441 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mbcf\" (UniqueName: \"kubernetes.io/projected/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-kube-api-access-5mbcf\") pod \"certified-operators-ml2bb\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") " pod="openshift-marketplace/certified-operators-ml2bb" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.256613 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-catalog-content\") pod \"certified-operators-ml2bb\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") " pod="openshift-marketplace/certified-operators-ml2bb" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.256746 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-utilities\") pod \"certified-operators-ml2bb\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") " pod="openshift-marketplace/certified-operators-ml2bb" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.357972 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5mbcf\" (UniqueName: \"kubernetes.io/projected/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-kube-api-access-5mbcf\") pod \"certified-operators-ml2bb\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") " pod="openshift-marketplace/certified-operators-ml2bb" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.358094 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-catalog-content\") pod \"certified-operators-ml2bb\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") " pod="openshift-marketplace/certified-operators-ml2bb" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.358227 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-utilities\") pod \"certified-operators-ml2bb\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") " pod="openshift-marketplace/certified-operators-ml2bb" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.358648 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-catalog-content\") pod \"certified-operators-ml2bb\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") " pod="openshift-marketplace/certified-operators-ml2bb" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.361244 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-utilities\") pod \"certified-operators-ml2bb\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") " pod="openshift-marketplace/certified-operators-ml2bb" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.377743 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5mbcf\" (UniqueName: \"kubernetes.io/projected/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-kube-api-access-5mbcf\") pod \"certified-operators-ml2bb\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") " pod="openshift-marketplace/certified-operators-ml2bb" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.505467 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ml2bb" Feb 17 16:02:22 crc kubenswrapper[4717]: I0217 16:02:22.979857 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ml2bb"] Feb 17 16:02:23 crc kubenswrapper[4717]: I0217 16:02:23.556317 4717 generic.go:334] "Generic (PLEG): container finished" podID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" containerID="4782f7b579f5742c31c329f605b134f4b2f39c21d4ed31711f6cbcddfb9672d1" exitCode=0 Feb 17 16:02:23 crc kubenswrapper[4717]: I0217 16:02:23.556405 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml2bb" event={"ID":"cd55824d-8ffd-43d9-9a49-7abb1169ed1b","Type":"ContainerDied","Data":"4782f7b579f5742c31c329f605b134f4b2f39c21d4ed31711f6cbcddfb9672d1"} Feb 17 16:02:23 crc kubenswrapper[4717]: I0217 16:02:23.556842 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml2bb" event={"ID":"cd55824d-8ffd-43d9-9a49-7abb1169ed1b","Type":"ContainerStarted","Data":"bcde6e51b35a65751fd1da01ff7257389a2144ccca1d052b7af39f58c1a72f38"} Feb 17 16:02:24 crc kubenswrapper[4717]: I0217 16:02:24.568247 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml2bb" event={"ID":"cd55824d-8ffd-43d9-9a49-7abb1169ed1b","Type":"ContainerStarted","Data":"c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146"} Feb 17 16:02:25 crc kubenswrapper[4717]: I0217 16:02:25.579639 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" containerID="c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146" exitCode=0
Feb 17 16:02:25 crc kubenswrapper[4717]: I0217 16:02:25.579808 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml2bb" event={"ID":"cd55824d-8ffd-43d9-9a49-7abb1169ed1b","Type":"ContainerDied","Data":"c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146"}
Feb 17 16:02:26 crc kubenswrapper[4717]: I0217 16:02:26.591270 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml2bb" event={"ID":"cd55824d-8ffd-43d9-9a49-7abb1169ed1b","Type":"ContainerStarted","Data":"87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9"}
Feb 17 16:02:26 crc kubenswrapper[4717]: I0217 16:02:26.610793 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ml2bb" podStartSLOduration=2.158619609 podStartE2EDuration="4.610774073s" podCreationTimestamp="2026-02-17 16:02:22 +0000 UTC" firstStartedPulling="2026-02-17 16:02:23.559821054 +0000 UTC m=+4209.975661560" lastFinishedPulling="2026-02-17 16:02:26.011975508 +0000 UTC m=+4212.427816024" observedRunningTime="2026-02-17 16:02:26.607533931 +0000 UTC m=+4213.023374417" watchObservedRunningTime="2026-02-17 16:02:26.610774073 +0000 UTC m=+4213.026614559"
Feb 17 16:02:32 crc kubenswrapper[4717]: I0217 16:02:32.506685 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ml2bb"
Feb 17 16:02:32 crc kubenswrapper[4717]: I0217 16:02:32.507286 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ml2bb"
Feb 17 16:02:32 crc kubenswrapper[4717]: I0217 16:02:32.923447 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ml2bb"
Feb 17 16:02:32 crc kubenswrapper[4717]: I0217 16:02:32.983747 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ml2bb"
Feb 17 16:02:33 crc kubenswrapper[4717]: I0217 16:02:33.172066 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ml2bb"]
Feb 17 16:02:34 crc kubenswrapper[4717]: I0217 16:02:34.727501 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ml2bb" podUID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" containerName="registry-server" containerID="cri-o://87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9" gracePeriod=2
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.262731 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ml2bb"
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.370908 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-catalog-content\") pod \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") "
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.371376 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-utilities\") pod \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") "
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.372111 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-utilities" (OuterVolumeSpecName: "utilities") pod "cd55824d-8ffd-43d9-9a49-7abb1169ed1b" (UID: "cd55824d-8ffd-43d9-9a49-7abb1169ed1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.371411 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mbcf\" (UniqueName: \"kubernetes.io/projected/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-kube-api-access-5mbcf\") pod \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\" (UID: \"cd55824d-8ffd-43d9-9a49-7abb1169ed1b\") "
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.372595 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.377530 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-kube-api-access-5mbcf" (OuterVolumeSpecName: "kube-api-access-5mbcf") pod "cd55824d-8ffd-43d9-9a49-7abb1169ed1b" (UID: "cd55824d-8ffd-43d9-9a49-7abb1169ed1b"). InnerVolumeSpecName "kube-api-access-5mbcf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.461006 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd55824d-8ffd-43d9-9a49-7abb1169ed1b" (UID: "cd55824d-8ffd-43d9-9a49-7abb1169ed1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.474124 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.474151 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mbcf\" (UniqueName: \"kubernetes.io/projected/cd55824d-8ffd-43d9-9a49-7abb1169ed1b-kube-api-access-5mbcf\") on node \"crc\" DevicePath \"\""
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.744592 4717 generic.go:334] "Generic (PLEG): container finished" podID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" containerID="87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9" exitCode=0
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.744637 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml2bb" event={"ID":"cd55824d-8ffd-43d9-9a49-7abb1169ed1b","Type":"ContainerDied","Data":"87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9"}
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.744675 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ml2bb" event={"ID":"cd55824d-8ffd-43d9-9a49-7abb1169ed1b","Type":"ContainerDied","Data":"bcde6e51b35a65751fd1da01ff7257389a2144ccca1d052b7af39f58c1a72f38"}
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.744716 4717 scope.go:117] "RemoveContainer" containerID="87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9"
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.744743 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ml2bb"
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.776158 4717 scope.go:117] "RemoveContainer" containerID="c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146"
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.812710 4717 scope.go:117] "RemoveContainer" containerID="4782f7b579f5742c31c329f605b134f4b2f39c21d4ed31711f6cbcddfb9672d1"
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.815668 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ml2bb"]
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.827327 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ml2bb"]
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.859560 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" path="/var/lib/kubelet/pods/cd55824d-8ffd-43d9-9a49-7abb1169ed1b/volumes"
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.878127 4717 scope.go:117] "RemoveContainer" containerID="87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9"
Feb 17 16:02:35 crc kubenswrapper[4717]: E0217 16:02:35.878690 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9\": container with ID starting with 87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9 not found: ID does not exist" containerID="87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9"
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.878726 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9"} err="failed to get container status \"87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9\": rpc error: code = NotFound desc = could not find container \"87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9\": container with ID starting with 87ab7d80d8e27ba7eeeed64383b30b7025dcb644a18767dd7a01d15ba19332d9 not found: ID does not exist"
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.878751 4717 scope.go:117] "RemoveContainer" containerID="c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146"
Feb 17 16:02:35 crc kubenswrapper[4717]: E0217 16:02:35.879276 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146\": container with ID starting with c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146 not found: ID does not exist" containerID="c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146"
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.879300 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146"} err="failed to get container status \"c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146\": rpc error: code = NotFound desc = could not find container \"c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146\": container with ID starting with c3fd8b3049b93ebe5388ff7cb7c8235b1147329c8c18baac4693807fe4cfa146 not found: ID does not exist"
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.879318 4717 scope.go:117] "RemoveContainer" containerID="4782f7b579f5742c31c329f605b134f4b2f39c21d4ed31711f6cbcddfb9672d1"
Feb 17 16:02:35 crc kubenswrapper[4717]: E0217 16:02:35.880027 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4782f7b579f5742c31c329f605b134f4b2f39c21d4ed31711f6cbcddfb9672d1\": container with ID starting with 4782f7b579f5742c31c329f605b134f4b2f39c21d4ed31711f6cbcddfb9672d1 not found: ID does not exist" containerID="4782f7b579f5742c31c329f605b134f4b2f39c21d4ed31711f6cbcddfb9672d1"
Feb 17 16:02:35 crc kubenswrapper[4717]: I0217 16:02:35.880074 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4782f7b579f5742c31c329f605b134f4b2f39c21d4ed31711f6cbcddfb9672d1"} err="failed to get container status \"4782f7b579f5742c31c329f605b134f4b2f39c21d4ed31711f6cbcddfb9672d1\": rpc error: code = NotFound desc = could not find container \"4782f7b579f5742c31c329f605b134f4b2f39c21d4ed31711f6cbcddfb9672d1\": container with ID starting with 4782f7b579f5742c31c329f605b134f4b2f39c21d4ed31711f6cbcddfb9672d1 not found: ID does not exist"
Feb 17 16:03:29 crc kubenswrapper[4717]: I0217 16:03:29.367168 4717 generic.go:334] "Generic (PLEG): container finished" podID="80cd448a-eadd-444f-a811-03e863a95efb" containerID="7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424" exitCode=0
Feb 17 16:03:29 crc kubenswrapper[4717]: I0217 16:03:29.367292 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hs9kl/must-gather-88x6k" event={"ID":"80cd448a-eadd-444f-a811-03e863a95efb","Type":"ContainerDied","Data":"7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424"}
Feb 17 16:03:29 crc kubenswrapper[4717]: I0217 16:03:29.368503 4717 scope.go:117] "RemoveContainer" containerID="7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424"
Feb 17 16:03:30 crc kubenswrapper[4717]: I0217 16:03:30.171328 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hs9kl_must-gather-88x6k_80cd448a-eadd-444f-a811-03e863a95efb/gather/0.log"
Feb 17 16:03:40 crc kubenswrapper[4717]: I0217 16:03:40.945221 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hs9kl/must-gather-88x6k"]
Feb 17 16:03:40 crc kubenswrapper[4717]: I0217 16:03:40.946068 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hs9kl/must-gather-88x6k" podUID="80cd448a-eadd-444f-a811-03e863a95efb" containerName="copy" containerID="cri-o://63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23" gracePeriod=2
Feb 17 16:03:40 crc kubenswrapper[4717]: I0217 16:03:40.962094 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hs9kl/must-gather-88x6k"]
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.491429 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hs9kl_must-gather-88x6k_80cd448a-eadd-444f-a811-03e863a95efb/copy/0.log"
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.491988 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hs9kl/must-gather-88x6k"
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.500427 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hs9kl_must-gather-88x6k_80cd448a-eadd-444f-a811-03e863a95efb/copy/0.log"
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.500742 4717 generic.go:334] "Generic (PLEG): container finished" podID="80cd448a-eadd-444f-a811-03e863a95efb" containerID="63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23" exitCode=143
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.500827 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hs9kl/must-gather-88x6k"
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.500849 4717 scope.go:117] "RemoveContainer" containerID="63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23"
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.527289 4717 scope.go:117] "RemoveContainer" containerID="7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424"
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.568109 4717 scope.go:117] "RemoveContainer" containerID="63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23"
Feb 17 16:03:41 crc kubenswrapper[4717]: E0217 16:03:41.568446 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23\": container with ID starting with 63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23 not found: ID does not exist" containerID="63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23"
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.568482 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23"} err="failed to get container status \"63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23\": rpc error: code = NotFound desc = could not find container \"63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23\": container with ID starting with 63dd57bf74f9676080f3c66148fb9a2b71617934d1b4c4ba47d8f7a4087b0f23 not found: ID does not exist"
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.568509 4717 scope.go:117] "RemoveContainer" containerID="7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424"
Feb 17 16:03:41 crc kubenswrapper[4717]: E0217 16:03:41.568937 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424\": container with ID starting with 7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424 not found: ID does not exist" containerID="7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424"
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.568963 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424"} err="failed to get container status \"7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424\": rpc error: code = NotFound desc = could not find container \"7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424\": container with ID starting with 7d0f90c5cfcfa57e8514b230881b75ffe92d82a6bfac4e062994cc65b072a424 not found: ID does not exist"
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.599941 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/80cd448a-eadd-444f-a811-03e863a95efb-must-gather-output\") pod \"80cd448a-eadd-444f-a811-03e863a95efb\" (UID: \"80cd448a-eadd-444f-a811-03e863a95efb\") "
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.600066 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9g2q\" (UniqueName: \"kubernetes.io/projected/80cd448a-eadd-444f-a811-03e863a95efb-kube-api-access-k9g2q\") pod \"80cd448a-eadd-444f-a811-03e863a95efb\" (UID: \"80cd448a-eadd-444f-a811-03e863a95efb\") "
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.607923 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80cd448a-eadd-444f-a811-03e863a95efb-kube-api-access-k9g2q" (OuterVolumeSpecName: "kube-api-access-k9g2q") pod "80cd448a-eadd-444f-a811-03e863a95efb" (UID: "80cd448a-eadd-444f-a811-03e863a95efb"). InnerVolumeSpecName "kube-api-access-k9g2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.702405 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9g2q\" (UniqueName: \"kubernetes.io/projected/80cd448a-eadd-444f-a811-03e863a95efb-kube-api-access-k9g2q\") on node \"crc\" DevicePath \"\""
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.769174 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80cd448a-eadd-444f-a811-03e863a95efb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "80cd448a-eadd-444f-a811-03e863a95efb" (UID: "80cd448a-eadd-444f-a811-03e863a95efb"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.804416 4717 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/80cd448a-eadd-444f-a811-03e863a95efb-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 17 16:03:41 crc kubenswrapper[4717]: I0217 16:03:41.864172 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80cd448a-eadd-444f-a811-03e863a95efb" path="/var/lib/kubelet/pods/80cd448a-eadd-444f-a811-03e863a95efb/volumes"
Feb 17 16:03:50 crc kubenswrapper[4717]: I0217 16:03:50.809210 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 16:03:50 crc kubenswrapper[4717]: I0217 16:03:50.810028 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 16:04:13 crc kubenswrapper[4717]: I0217 16:04:13.491579 4717 scope.go:117] "RemoveContainer" containerID="11447aa9e8165114db83dbefa3055034e7d022d98cbc9f8c7e417d5364dbc577"
Feb 17 16:04:20 crc kubenswrapper[4717]: I0217 16:04:20.809177 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 16:04:20 crc kubenswrapper[4717]: I0217 16:04:20.809930 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 16:04:50 crc kubenswrapper[4717]: I0217 16:04:50.808413 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 16:04:50 crc kubenswrapper[4717]: I0217 16:04:50.809094 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 16:04:50 crc kubenswrapper[4717]: I0217 16:04:50.809185 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m"
Feb 17 16:04:50 crc kubenswrapper[4717]: I0217 16:04:50.810102 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d1eca0fe5740b1a6f347e856332dd2fb03e726854f088ec4724d465bf64a29e"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 16:04:50 crc kubenswrapper[4717]: I0217 16:04:50.810201 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://6d1eca0fe5740b1a6f347e856332dd2fb03e726854f088ec4724d465bf64a29e" gracePeriod=600
Feb 17 16:04:51 crc kubenswrapper[4717]: I0217 16:04:51.258549 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="6d1eca0fe5740b1a6f347e856332dd2fb03e726854f088ec4724d465bf64a29e" exitCode=0
Feb 17 16:04:51 crc kubenswrapper[4717]: I0217 16:04:51.258628 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"6d1eca0fe5740b1a6f347e856332dd2fb03e726854f088ec4724d465bf64a29e"}
Feb 17 16:04:51 crc kubenswrapper[4717]: I0217 16:04:51.259047 4717 scope.go:117] "RemoveContainer" containerID="ceff3af288f9bad20e7800f647d9834c95b54f3ccfeda57c040ac8ca2a521aa9"
Feb 17 16:04:52 crc kubenswrapper[4717]: I0217 16:04:52.270170 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerStarted","Data":"3bdebf1c11658785bb91264fd7063e3b8fe7130b6de87ee4664ff63deb304b15"}
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.135041 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xnthn"]
Feb 17 16:06:36 crc kubenswrapper[4717]: E0217 16:06:36.136115 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cd448a-eadd-444f-a811-03e863a95efb" containerName="copy"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.136131 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="80cd448a-eadd-444f-a811-03e863a95efb" containerName="copy"
Feb 17 16:06:36 crc kubenswrapper[4717]: E0217 16:06:36.136150 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" containerName="extract-utilities"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.136160 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" containerName="extract-utilities"
Feb 17 16:06:36 crc kubenswrapper[4717]: E0217 16:06:36.136179 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" containerName="extract-content"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.136188 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" containerName="extract-content"
Feb 17 16:06:36 crc kubenswrapper[4717]: E0217 16:06:36.136208 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cd448a-eadd-444f-a811-03e863a95efb" containerName="gather"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.136216 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="80cd448a-eadd-444f-a811-03e863a95efb" containerName="gather"
Feb 17 16:06:36 crc kubenswrapper[4717]: E0217 16:06:36.136240 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" containerName="registry-server"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.136250 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" containerName="registry-server"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.136520 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd55824d-8ffd-43d9-9a49-7abb1169ed1b" containerName="registry-server"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.136543 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="80cd448a-eadd-444f-a811-03e863a95efb" containerName="copy"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.136565 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="80cd448a-eadd-444f-a811-03e863a95efb" containerName="gather"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.138303 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.162682 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xnthn"]
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.235534 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-utilities\") pod \"community-operators-xnthn\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") " pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.236019 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vcc6\" (UniqueName: \"kubernetes.io/projected/b8eeecab-932e-4ba9-b9db-751d107a52eb-kube-api-access-8vcc6\") pod \"community-operators-xnthn\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") " pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.236153 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-catalog-content\") pod \"community-operators-xnthn\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") " pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.338703 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vcc6\" (UniqueName: \"kubernetes.io/projected/b8eeecab-932e-4ba9-b9db-751d107a52eb-kube-api-access-8vcc6\") pod \"community-operators-xnthn\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") " pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.338769 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-catalog-content\") pod \"community-operators-xnthn\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") " pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.338828 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-utilities\") pod \"community-operators-xnthn\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") " pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.339419 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-utilities\") pod \"community-operators-xnthn\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") " pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.339844 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-catalog-content\") pod \"community-operators-xnthn\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") " pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.375267 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vcc6\" (UniqueName: \"kubernetes.io/projected/b8eeecab-932e-4ba9-b9db-751d107a52eb-kube-api-access-8vcc6\") pod \"community-operators-xnthn\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") " pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:36 crc kubenswrapper[4717]: I0217 16:06:36.509793 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:37 crc kubenswrapper[4717]: I0217 16:06:37.106030 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xnthn"]
Feb 17 16:06:37 crc kubenswrapper[4717]: I0217 16:06:37.473310 4717 generic.go:334] "Generic (PLEG): container finished" podID="b8eeecab-932e-4ba9-b9db-751d107a52eb" containerID="b1d30056778e265afa35e3d6e373ec46274fdc3514f86f1a63656f28ab0a510d" exitCode=0
Feb 17 16:06:37 crc kubenswrapper[4717]: I0217 16:06:37.473356 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnthn" event={"ID":"b8eeecab-932e-4ba9-b9db-751d107a52eb","Type":"ContainerDied","Data":"b1d30056778e265afa35e3d6e373ec46274fdc3514f86f1a63656f28ab0a510d"}
Feb 17 16:06:37 crc kubenswrapper[4717]: I0217 16:06:37.473386 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnthn" event={"ID":"b8eeecab-932e-4ba9-b9db-751d107a52eb","Type":"ContainerStarted","Data":"dd855f731f826d4d6d24d5e67a6b67f69dbf279d8c67215765c61aefa0286705"}
Feb 17 16:06:37 crc kubenswrapper[4717]: I0217 16:06:37.481217 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 16:06:39 crc kubenswrapper[4717]: I0217 16:06:39.506491 4717 generic.go:334] "Generic (PLEG): container finished" podID="b8eeecab-932e-4ba9-b9db-751d107a52eb" containerID="51c7c968523a068e35bec7eacb6cde83c5ad8359fa0ad70618f24f6769e166b5" exitCode=0
Feb 17 16:06:39 crc kubenswrapper[4717]: I0217 16:06:39.506580 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnthn" event={"ID":"b8eeecab-932e-4ba9-b9db-751d107a52eb","Type":"ContainerDied","Data":"51c7c968523a068e35bec7eacb6cde83c5ad8359fa0ad70618f24f6769e166b5"}
Feb 17 16:06:40 crc kubenswrapper[4717]: I0217 16:06:40.524242 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnthn" event={"ID":"b8eeecab-932e-4ba9-b9db-751d107a52eb","Type":"ContainerStarted","Data":"a0d73326e87571078d72033670bfb7a8a4ab652c44812b877413ccdc52354fc3"}
Feb 17 16:06:40 crc kubenswrapper[4717]: I0217 16:06:40.555015 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xnthn" podStartSLOduration=2.068360881 podStartE2EDuration="4.554982096s" podCreationTimestamp="2026-02-17 16:06:36 +0000 UTC" firstStartedPulling="2026-02-17 16:06:37.480729046 +0000 UTC m=+4463.896569542" lastFinishedPulling="2026-02-17 16:06:39.967350261 +0000 UTC m=+4466.383190757" observedRunningTime="2026-02-17 16:06:40.546500927 +0000 UTC m=+4466.962341473" watchObservedRunningTime="2026-02-17 16:06:40.554982096 +0000 UTC m=+4466.970822602"
Feb 17 16:06:46 crc kubenswrapper[4717]: I0217 16:06:46.510356 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:46 crc kubenswrapper[4717]: I0217 16:06:46.512649 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:46 crc kubenswrapper[4717]: I0217 16:06:46.583538 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:46 crc kubenswrapper[4717]: I0217 16:06:46.659849 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:50 crc kubenswrapper[4717]: I0217 16:06:50.317755 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xnthn"]
Feb 17 16:06:50 crc kubenswrapper[4717]: I0217 16:06:50.319707 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xnthn" podUID="b8eeecab-932e-4ba9-b9db-751d107a52eb" containerName="registry-server" containerID="cri-o://a0d73326e87571078d72033670bfb7a8a4ab652c44812b877413ccdc52354fc3" gracePeriod=2
Feb 17 16:06:50 crc kubenswrapper[4717]: I0217 16:06:50.633717 4717 generic.go:334] "Generic (PLEG): container finished" podID="b8eeecab-932e-4ba9-b9db-751d107a52eb" containerID="a0d73326e87571078d72033670bfb7a8a4ab652c44812b877413ccdc52354fc3" exitCode=0
Feb 17 16:06:50 crc kubenswrapper[4717]: I0217 16:06:50.633759 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnthn" event={"ID":"b8eeecab-932e-4ba9-b9db-751d107a52eb","Type":"ContainerDied","Data":"a0d73326e87571078d72033670bfb7a8a4ab652c44812b877413ccdc52354fc3"}
Feb 17 16:06:50 crc kubenswrapper[4717]: I0217 16:06:50.792346 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnthn"
Feb 17 16:06:50 crc kubenswrapper[4717]: I0217 16:06:50.986340 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-utilities\") pod \"b8eeecab-932e-4ba9-b9db-751d107a52eb\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") "
Feb 17 16:06:50 crc kubenswrapper[4717]: I0217 16:06:50.986544 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vcc6\" (UniqueName: \"kubernetes.io/projected/b8eeecab-932e-4ba9-b9db-751d107a52eb-kube-api-access-8vcc6\") pod \"b8eeecab-932e-4ba9-b9db-751d107a52eb\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") "
Feb 17 16:06:50 crc kubenswrapper[4717]: I0217 16:06:50.986702 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-catalog-content\") pod \"b8eeecab-932e-4ba9-b9db-751d107a52eb\" (UID: \"b8eeecab-932e-4ba9-b9db-751d107a52eb\") "
Feb 17 16:06:50 crc kubenswrapper[4717]: I0217 16:06:50.988287 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-utilities" (OuterVolumeSpecName: "utilities") pod "b8eeecab-932e-4ba9-b9db-751d107a52eb" (UID: "b8eeecab-932e-4ba9-b9db-751d107a52eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 16:06:50 crc kubenswrapper[4717]: I0217 16:06:50.996985 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8eeecab-932e-4ba9-b9db-751d107a52eb-kube-api-access-8vcc6" (OuterVolumeSpecName: "kube-api-access-8vcc6") pod "b8eeecab-932e-4ba9-b9db-751d107a52eb" (UID: "b8eeecab-932e-4ba9-b9db-751d107a52eb"). InnerVolumeSpecName "kube-api-access-8vcc6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.058388 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8eeecab-932e-4ba9-b9db-751d107a52eb" (UID: "b8eeecab-932e-4ba9-b9db-751d107a52eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.090605 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.090667 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8eeecab-932e-4ba9-b9db-751d107a52eb-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.090739 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vcc6\" (UniqueName: \"kubernetes.io/projected/b8eeecab-932e-4ba9-b9db-751d107a52eb-kube-api-access-8vcc6\") on node \"crc\" DevicePath \"\""
Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.649740 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnthn" event={"ID":"b8eeecab-932e-4ba9-b9db-751d107a52eb","Type":"ContainerDied","Data":"dd855f731f826d4d6d24d5e67a6b67f69dbf279d8c67215765c61aefa0286705"}
Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.649814 4717 scope.go:117] "RemoveContainer" containerID="a0d73326e87571078d72033670bfb7a8a4ab652c44812b877413ccdc52354fc3"
Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.649878 4717 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-xnthn" Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.683267 4717 scope.go:117] "RemoveContainer" containerID="51c7c968523a068e35bec7eacb6cde83c5ad8359fa0ad70618f24f6769e166b5" Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.718739 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xnthn"] Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.727559 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xnthn"] Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.730523 4717 scope.go:117] "RemoveContainer" containerID="b1d30056778e265afa35e3d6e373ec46274fdc3514f86f1a63656f28ab0a510d" Feb 17 16:06:51 crc kubenswrapper[4717]: I0217 16:06:51.862334 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8eeecab-932e-4ba9-b9db-751d107a52eb" path="/var/lib/kubelet/pods/b8eeecab-932e-4ba9-b9db-751d107a52eb/volumes" Feb 17 16:07:20 crc kubenswrapper[4717]: I0217 16:07:20.808238 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:07:20 crc kubenswrapper[4717]: I0217 16:07:20.809186 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:07:50 crc kubenswrapper[4717]: I0217 16:07:50.808741 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:07:50 crc kubenswrapper[4717]: I0217 16:07:50.809850 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:08:20 crc kubenswrapper[4717]: I0217 16:08:20.808772 4717 patch_prober.go:28] interesting pod/machine-config-daemon-dtt4m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:08:20 crc kubenswrapper[4717]: I0217 16:08:20.809273 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:08:20 crc kubenswrapper[4717]: I0217 16:08:20.809316 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" Feb 17 16:08:20 crc kubenswrapper[4717]: I0217 16:08:20.809945 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3bdebf1c11658785bb91264fd7063e3b8fe7130b6de87ee4664ff63deb304b15"} pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 16:08:20 crc kubenswrapper[4717]: I0217 16:08:20.809989 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerName="machine-config-daemon" containerID="cri-o://3bdebf1c11658785bb91264fd7063e3b8fe7130b6de87ee4664ff63deb304b15" gracePeriod=600 Feb 17 16:08:20 crc kubenswrapper[4717]: E0217 16:08:20.941959 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" Feb 17 16:08:21 crc kubenswrapper[4717]: I0217 16:08:21.603979 4717 generic.go:334] "Generic (PLEG): container finished" podID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9" containerID="3bdebf1c11658785bb91264fd7063e3b8fe7130b6de87ee4664ff63deb304b15" exitCode=0 Feb 17 16:08:21 crc kubenswrapper[4717]: I0217 16:08:21.604060 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" event={"ID":"7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9","Type":"ContainerDied","Data":"3bdebf1c11658785bb91264fd7063e3b8fe7130b6de87ee4664ff63deb304b15"} Feb 17 16:08:21 crc kubenswrapper[4717]: I0217 16:08:21.604335 4717 scope.go:117] "RemoveContainer" containerID="6d1eca0fe5740b1a6f347e856332dd2fb03e726854f088ec4724d465bf64a29e" Feb 17 16:08:21 crc kubenswrapper[4717]: I0217 16:08:21.605056 4717 scope.go:117] "RemoveContainer" containerID="3bdebf1c11658785bb91264fd7063e3b8fe7130b6de87ee4664ff63deb304b15" Feb 17 16:08:21 crc kubenswrapper[4717]: E0217 16:08:21.605524 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-dtt4m_openshift-machine-config-operator(7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dtt4m" podUID="7bdad0fc-dc29-4d62-8de2-0aa8c07c73b9"